## ptune-FLAN-OPT-2.7b

OPT was first introduced in [Open Pre-trained Transformer Language Models](https://arxiv.org/abs/2205.01068) and first released in [metaseq's repository](https://github.com/facebookresearch/metaseq) on May 3rd, 2022 by Meta AI.

This model is [facebook/opt-2.7b](https://hf.co/facebook/opt-2.7b) finetuned with [prefix tuning](https://arxiv.org/abs/2101.00190) on the [FLAN datasets](https://arxiv.org/pdf/2210.11416.pdf).

A 24-token prefix was finetuned over 3.7M new tokens of a FLAN task mixture, with the start of each example cut off when it was too long to fit within a 512-token context.
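
The left-truncation described above can be sketched as follows. This is a hypothetical illustration, not the actual training script; the `truncate_left` helper and toy token ids are assumptions for the example.

```python
# Hypothetical sketch of the truncation described above (not the actual
# training code): when a tokenized example exceeds the context window,
# tokens are dropped from the START so the end of the example survives.

CONTEXT_LIMIT = 512  # context length used during finetuning

def truncate_left(token_ids, limit=CONTEXT_LIMIT):
    """Keep at most `limit` tokens, cutting off the start of the example."""
    if len(token_ids) <= limit:
        return token_ids
    return token_ids[-limit:]

example = list(range(600))           # a 600-token example (toy ids)
truncated = truncate_left(example)
print(len(truncated), truncated[0])  # 512 88 — the first 88 tokens were cut
```

Cutting from the start rather than the end preserves the answer portion of each FLAN example, which typically sits at the end of the sequence.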

The model reaches a train perplexity (ppl) of 5.95 and an eval ppl of 4.50.
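
For reference, perplexity figures like those above are the exponential of the mean per-token cross-entropy loss (in nats). A minimal sketch, with the `perplexity` helper and the example losses being illustrative assumptions:

```python
import math

def perplexity(token_losses):
    """exp of the mean negative log-likelihood per token (losses in nats)."""
    return math.exp(sum(token_losses) / len(token_losses))

# A mean token loss of ~1.504 nats corresponds to a ppl of about 4.50.
print(round(perplexity([1.504, 1.504]), 2))
```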

### Example CoT (Chain-of-Thought) Prompt: