---
license: mit
language:
  - ja
  - en
  - zh
tags:
  - LLaMA2
  - Japanese
  - LLM
---

# Yoko-7B-Japanese-v0

This model was trained on the Guanaco dataset, using only 49,000 chat samples.
It improves performance in Chinese and Japanese.
The vanilla LLaMA2-7B was fine-tuned with QLoRA.
You can use test.py to test the model.

Recommended generation parameters:

- temperature: 0.5~0.7
- top_p: 0.65~1.0
- top_k: 30~50
- repetition_penalty: 1.03~1.17
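The recommended parameters above can be passed directly to `model.generate` in `transformers`. Below is a minimal sketch, assuming the model is published under the repo id `ganchengguang/Yoko-7B-Japanese-v0` (an assumption based on this card's title); the chosen values are mid-range picks from the recommendations, not prescribed defaults.

```python
# Sampling settings drawn from the recommended ranges in this card.
generation_kwargs = {
    "do_sample": True,
    "temperature": 0.6,         # recommended range 0.5~0.7
    "top_p": 0.9,               # recommended range 0.65~1.0
    "top_k": 40,                # recommended range 30~50
    "repetition_penalty": 1.1,  # recommended range 1.03~1.17
    "max_new_tokens": 256,
}

def generate(prompt: str) -> str:
    """Sample a completion with the recommended parameters (requires transformers + torch)."""
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo = "ganchengguang/Yoko-7B-Japanese-v0"  # assumed repo id
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForCausalLM.from_pretrained(repo, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, **generation_kwargs)
    # Decode only the newly generated tokens, not the echoed prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)
```

Loading the 7B checkpoint needs roughly 14 GB of memory in fp16; pass `load_in_4bit=True` (with `bitsandbytes` installed) to `from_pretrained` if that is too large for your hardware.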

Contributed by the Mori Lab at Yokohama National University.