---
language:
- en
datasets:
- the_pile
- super_glue
tags:
- pytorch
- causal-lm
- 8bit
inference: false
---
Side project; not a finished, production-ready model.
See [this repo](https://huggingface.co/crumb/gpt-j-6b-shakespeare) for more information. The model was finetuned with [8-bit Adam](https://arxiv.org/abs/2110.02861) on custom-transformed SuperGLUE tasks, with prompts rewritten in roughly the style of the [T0 paper](https://arxiv.org/abs/2110.08207).
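The training loop for this finetune isn't published, so the following is only a minimal sketch of how an 8-bit Adam optimizer is typically constructed with bitsandbytes; the dummy module and learning rate are placeholders, not the values actually used.

```python
# Minimal sketch: constructing bitsandbytes' 8-bit Adam optimizer.
# The dummy module and learning rate below are placeholders; the real
# finetune would pass the loaded GPT-J model's parameters instead.
import torch
import bitsandbytes as bnb

model = torch.nn.Linear(10, 10)  # stand-in for the GPT-J model
optimizer = bnb.optim.Adam8bit(model.parameters(), lr=1e-5)

# From here a standard PyTorch loop applies:
# loss.backward(); optimizer.step(); optimizer.zero_grad()
```

Inference with the finetuned 8-bit checkpoint: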
```python
# Install pinned dependencies (Colab/Jupyter cell syntax)
!pip install transformers==4.14.1 -q
!pip install bitsandbytes-cuda111==0.26.0 -q
!pip install git+https://github.com/aicrumb/transformers-8bit -q

import transformers_8bit

# Load the finetuned GPT-J checkpoint in 8-bit onto the GPU
model, tokenizer, config = transformers_8bit.gptj("crumb/gpt-j-6b-finetune-super-glue", device='cuda')

# Build a prompt in the <QUERY> ... <RESPONSE> format used during finetuning
prompt = tokenizer("<QUERY> If birds are in group B, and snakes are in group A, what group are pythons in? <RESPONSE>", return_tensors='pt')
prompt = {key: value.to('cuda') for key, value in prompt.items()}

# Greedily generate exactly two new tokens for the answer
length = len(prompt['input_ids'][0])
out = model.generate(**prompt, min_length=length+2, max_length=length+2, do_sample=False, pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(out[0]))
"""output
<QUERY> If birds are in group B, and snakes are in group A, what group are pythons in? <RESPONSE> Group A
"""
```
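For a sense of what the custom-transformed SuperGLUE tasks could look like, below is a hypothetical T0-style transformation of a SuperGLUE BoolQ example into the `<QUERY> ... <RESPONSE>` format used above. The template wording and label mapping are assumptions; the actual templates for this finetune are not published.

```python
# Hypothetical sketch: turning a SuperGLUE BoolQ example into the
# <QUERY> ... <RESPONSE> prompt format. The template wording and the
# yes/no label mapping are assumptions, not the published recipe.
from datasets import load_dataset

dataset = load_dataset("super_glue", "boolq", split="train")

def to_prompt(example):
    # Concatenate passage and question, then append the gold answer
    query = f"{example['passage']} {example['question']}?"
    response = "yes" if example["label"] == 1 else "no"
    return {"text": f"<QUERY> {query} <RESPONSE> {response}"}

prompts = dataset.map(to_prompt)
print(prompts[0]["text"])
```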