qwerrwe / src/axolotl/monkeypatch/llama_attn_hijack_xformers.py

Commit History

various bugfixes (#856)
1470650
unverified

winglian committed

standardize attn hijack patches (#381)
06edf17
unverified

tmm1 and winglian committed

Attention mask and position id fixes for packing (#285)
2bb0b78
unverified

winglian committed

Update XFormers Attention Monkeypatch to handle Llama-2 70B (GQA) (#339)
10405b9
unverified

ssmi153 committed

fix sdp attention to use the flash/mem-efficient context manager
a032c9f

winglian committed

don't worry about dupes
c56818b

winglian committed

Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py
1076bcb
unverified

winglian and Nanobit committed

Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py
2daa683
unverified

winglian and Nanobit committed

black formatting
ad0ea6a

winglian committed

copy xformers attn from ooba since we removed dep on alpaca_lora_4bit
6cb2310

winglian committed