---
language:
- en
datasets:
- The Pile
- super_glue
tags:
- pytorch
- causal-lm
- 8bit
inference: false
---

Side project, not a final or polished product.

See [this repo](https://huggingface.co/crumb/gpt-j-6b-shakespeare) for more information. Finetuned with [8-bit Adam](https://arxiv.org/abs/2110.02861) on custom-transformed SuperGLUE tasks, transformed similarly to the [T0 paper](https://arxiv.org/abs/2110.08207).
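The exact templates used to transform the SuperGLUE tasks are not documented here, so the following is only a hypothetical sketch of how an example might be wrapped in the `<QUERY>`/`<RESPONSE>` format that the inference prompt below uses; the helper name and wording are assumptions.

```python
# Hypothetical illustration of templating a task example into the
# <QUERY>/<RESPONSE> format. The actual finetuning templates are not
# documented, so this helper and its wording are assumptions.

def to_prompt(question: str, response: str = "") -> str:
    """Wrap a question (and optionally its answer) in QUERY/RESPONSE tags."""
    return f"<QUERY> {question} <RESPONSE> {response}".rstrip()

# A training-style example would include the answer:
print(to_prompt("Is a python a snake? Answer yes or no.", "yes"))
# At inference time, the prompt ends at <RESPONSE> for the model to complete:
print(to_prompt("Is a python a snake? Answer yes or no."))
```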

```python
# Install pinned dependencies (run in a notebook / Colab)
!pip install transformers==4.14.1 -q
!pip install bitsandbytes-cuda111==0.26.0 -q
!pip install git+https://github.com/aicrumb/transformers-8bit -q

import transformers_8bit

# Load the 8-bit quantized model, tokenizer, and config onto the GPU
model, tokenizer, config = transformers_8bit.gptj("crumb/gpt-j-6b-finetune-super-glue", device='cuda')

# Tokenize the prompt and move the tensors to the GPU
prompt = tokenizer("<QUERY> If birds are in group B, and snakes are in group A, what group are pythons in? <RESPONSE>", return_tensors='pt')
prompt = {key: value.to('cuda') for key, value in prompt.items()}

# Greedily generate exactly two new tokens after the prompt
length = len(prompt['input_ids'][0])
out = model.generate(**prompt, min_length=length+2, max_length=length+2, do_sample=False, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(out[0]))

"""output
<QUERY> If birds are in group B, and snakes are in group A, what group are pythons in? <RESPONSE> Group A
"""
```
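Since `generate` returns the prompt ids followed by the newly generated ids, the response alone can be recovered by slicing off the first `length` ids before decoding. A minimal self-contained sketch with stand-in token ids (real usage would decode `out[0][length:]` with the tokenizer):

```python
# Minimal sketch: generate() output contains the prompt tokens followed by
# the new tokens, so slicing off the first `length` ids isolates the response.
prompt_ids = [101, 102, 103]           # stand-in ids for the tokenized prompt
output_ids = prompt_ids + [201, 202]   # generate() returns prompt + new tokens
length = len(prompt_ids)
response_ids = output_ids[length:]     # only the newly generated ids
print(response_ids)  # [201, 202]
```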