qwerrwe / src/axolotl/monkeypatch

Commit History

fix sdp attention to use the flash/mem-efficient context manager
a032c9f

winglian committed on

Fixed pre-commit problems, fixed small bug in logging_config to handle LOG_LEVEL env var
b1f4f7a

theobjectivedad committed on

Fix set mem_id for inference and refactor
974dc00

Nanobit committed on

Clean up landmark patching
a6190c8

Nanobit committed on

Refactor landmark attention patch
919727b

Nanobit committed on

add support for extending context with xpos rope
a03a7d7

winglian committed on

Fix grad checkpoint and outputs param
2a801b0

Nanobit committed on

Feat: Add landmark attention
55b8542

Nanobit committed on

don't worry about dupes
c56818b

winglian committed on

Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py
1076bcb
unverified

winglian Nanobit committed on

Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py
2daa683
unverified

winglian Nanobit committed on

black formatting
ad0ea6a

winglian committed on

copy xformers attn from ooba since we removed dep on alpaca_lora_4bit
6cb2310

winglian committed on