|
---
license: apache-2.0
language:
- en
- zh
base_model:
- meta-llama/Meta-Llama-3.1-8B-Instruct
datasets:
- BAAI/IndustryInstruction_Finance-Economics
- BAAI/IndustryInstruction
---
|
|
|
This model is fine-tuned from llama3.1-8b-instruct on the [BAAI/IndustryInstruction_Finance-Economics](https://huggingface.co/datasets/BAAI/IndustryInstruction_Finance-Economics) dataset. For details on the dataset, see the repo: [BAAI/IndustryInstruction](https://huggingface.co/datasets/BAAI/IndustryInstruction)
|
|
|
## training params |
|
```
learning_rate=1e-5
lr_scheduler_type=cosine
max_length=2048
warmup_ratio=0.05
batch_size=64
epoch=10
```
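As a rough illustration, the cosine schedule with 5% warmup implied by the parameters above can be sketched in plain Python (the helper function and step counts are illustrative, not taken from the actual training code):

```python
import math

def lr_at_step(step, total_steps, base_lr=1e-5, warmup_ratio=0.05):
    """Cosine learning-rate schedule with linear warmup,
    mirroring learning_rate=1e-5 and warmup_ratio=0.05 above."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        # linear warmup from 0 up to base_lr
        return base_lr * step / max(1, warmup_steps)
    # cosine decay from base_lr down toward 0 over the remaining steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# peak LR is reached right at the end of warmup, then decays toward 0
print(lr_at_step(50, 1000))    # end of warmup: base_lr
print(lr_at_step(1000, 1000))  # end of training: ~0
```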
|
|
|
The best checkpoint is selected by evaluation loss.
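The selection rule amounts to keeping the checkpoint with the lowest evaluation loss; a minimal sketch (the checkpoint names and loss values here are hypothetical):

```python
# hypothetical evaluation losses logged for each saved checkpoint
eval_losses = {
    "checkpoint-1000": 1.42,
    "checkpoint-2000": 1.31,
    "checkpoint-3000": 1.35,
}

# keep the checkpoint whose evaluation loss is lowest
best_ckpt = min(eval_losses, key=eval_losses.get)
print(best_ckpt)  # -> checkpoint-2000
```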
|
## evaluation |
|
|
|
The following is an evaluation on the FinerBen dataset metrics. Since the datasets contain too many samples to evaluate in full, I randomly selected 500 samples from each dataset for evaluation.
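A minimal sketch of the subsampling described above, drawing 500 examples per dataset with a fixed seed (the helper function and seed are assumptions, not the exact evaluation code):

```python
import random

def sample_subset(examples, k=500, seed=42):
    """Randomly draw k examples (or all of them if fewer are available)."""
    rng = random.Random(seed)
    return rng.sample(examples, min(k, len(examples)))

# e.g. thin a large evaluation set down to 500 samples
dataset = [{"id": i} for i in range(10_000)]
subset = sample_subset(dataset)
print(len(subset))  # -> 500
```

Fixing the seed keeps the 500-sample subsets reproducible across evaluation runs.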
|
|
|
![image/png](https://cdn-uploads.huggingface.co/production/uploads/642f6c64f945a8a5c9ee5b5d/shSgSkQ7nQqiBAl6IwBy5.png) |