Commit History

Merge pull request #159 from AngainorDev/patch-1 (8e568bb, Nanobit, unverified)
Merge pull request #194 from NanoCode012/fix/config-path (e21dab4, Nanobit, unverified)
Fix config path after config moved (52cde69, Nanobit)
config fixes (9a58e99, winglian)
add typehints (c7dee56, winglian)
add new sharegpt, refactor prompt so it can be customized later, add exception if no data is processed (aac4b76, winglian)
Merge pull request #191 from OpenAccess-AI-Collective/NanoCode012-patch-1 (f31a338, Nanobit, unverified)
Add save_steps and eval_steps to Readme (4cd1dee, Nanobit, unverified)
Merge pull request #190 from OpenAccess-AI-Collective/fixes-20230711-v2 (9ac16ed, winglian, unverified)
forgot to add this file (6b3f509, winglian)
gptq lora llama is obviously good (336aa3f, winglian)
update openllama and clean up paths (d0d7eaa, winglian)
fix table formatting (a6ebf57, winglian)
more matrix updates (280832c, winglian)
update the support matrix (a43bae9, winglian)
more pruning (effbbf6, winglian)
add check for attr (c9a149f, winglian)
more config pruning and migrating (c530e4b, winglian)
Merge pull request #189 from OpenAccess-AI-Collective/fixes-20230711 (f620706, winglian, unverified)
get rid of some configs, formalize pythia lora config (77762a5, winglian)
new validation for mpt w grad checkpoints (14668fa, winglian)
Fix strict and Lint (b565ecf, Angainor)
match up gradient checkpointing when using lora w config (fe0b768, winglian)
Merge pull request #186 from akj2018/main (e944311, Nanobit, unverified)
Update FAQS.md (e3e7b52, Akj2023, unverified)
Fix set mem_id for inference and refactor (974dc00, Nanobit)
Set mem cache args on inference (572d114, Nanobit)
Clean up landmark patching (a6190c8, Nanobit)
Fix undefined LlamaForCausalLM and del try except (563b6d8, Nanobit)
peft no longer needs device_map (cd0a6f6, winglian)
Update FAQS.md (dd7d16d, Akj2023, unverified)
Refactor landmark attention patch (919727b, Nanobit)
Update FAQS.md (5ffefee, Akj2023, unverified)
Merge pull request #183 from OpenAccess-AI-Collective/inference-from-stdin (d9f713e, winglian, unverified)
fix formatting (958da70, winglian)
pass a prompt in from stdin for inference (c4e4f81, winglian)
Fix missing cfg. (a808bf9, Angainor Development, unverified)
Merge pull request #182 from OpenAccess-AI-Collective/fix-llama-ref (0124825, winglian, unverified)
Update scripts/finetune.py (759e867, winglian and Nanobit, unverified)
address PR feedback (0c6f928, winglian)
add streaming dataset support for pretraining datasets (eea2731, winglian)
linting fix (1db46a9, winglian)
more gpt-neox long ctx fixes (ab5cd28, winglian)
fix bettertransformers save, force it to skip after saving correctly in callback (1a82082, winglian)
more tweaks to do pre-training with bettertransformers (1210dc8, winglian)
experimental expansion of ctx len (488a67d, winglian)
add validation/warning for bettertransformers and torch version (71a43f8, winglian)
use pythia-12b, neox-20b is flaky (3961902, winglian)