qwerrwe / src/axolotl/monkeypatch

Commit History

add noisy embedding (#721)
3bd9528

Maxime committed
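
For context: "noisy embedding" here is the NEFTune technique, which perturbs the embedding-layer output with uniform noise during fine-tuning. A minimal sketch of the idea (illustrative, not this commit's actual patch):

```python
import torch


def add_embedding_noise(embeds: torch.Tensor, noise_alpha: float = 5.0) -> torch.Tensor:
    """Add NEFTune-style uniform noise to embedding outputs during training.

    embeds: (batch, seq_len, hidden_dim). The noise magnitude is scaled by
    alpha / sqrt(seq_len * hidden_dim), per the NEFTune formulation.
    """
    seq_len, hidden_dim = embeds.shape[1], embeds.shape[2]
    scale = noise_alpha / (seq_len * hidden_dim) ** 0.5
    return embeds + torch.empty_like(embeds).uniform_(-scale, scale)
```

In practice the noise is applied only in training mode, typically by wrapping the embedding module's forward.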

flash_attention + sample packing for stablelm 3b (#671)
2d60ba3

winglian committed

fix for flash attn w mistral w/o sample packing (#648)
b2edaae

winglian committed

Mistral flash attn packing (#646)
b6ab8aa

winglian committed

skip some flash attn patches unless explicitly enabled (#643)
895f0a0

winglian committed

use fastchat conversations template (#578)
e7d3e2d

winglian committed
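
FastChat's conversation registry builds prompts from named templates; this is roughly how that API is used (the template name below is just an example):

```python
from fastchat.conversation import get_conv_template

# "vicuna_v1.1" is one example template name from FastChat's registry.
conv = get_conv_template("vicuna_v1.1")
conv.append_message(conv.roles[0], "What does this monkeypatch do?")
conv.append_message(conv.roles[1], None)  # leave the assistant turn open
print(conv.get_prompt())
```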

update for recent transformers updates (#636)
60c7c48

winglian committed

Feat: Add support for upstream FA2 (#626)
19a600a

Nanobit committed
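
"Upstream FA2" means Flash Attention 2 as shipped by transformers itself rather than a local monkeypatch. A sketch of opting in; at the time of this commit the flag was spelled `use_flash_attention_2=True`, while current transformers releases use the keyword shown below:

```python
import torch
from transformers import AutoModelForCausalLM

# Requires the flash-attn package and a CUDA device; the checkpoint
# name is only an example.
model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
)
```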

btlm and falcon monkey patches for flash attn (#566)
6b9b229

winglian committed

Add training callback to send predictions to WandB table (#521)
5b67ea9

Glavin001 committed

reorg a bit
fc8766e

tmm1 committed

use flash_attn rmsnorm when available (#526)
72a6fe1

tmm1 committed
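
The pattern, roughly: if flash-attn's fused RMSNorm kernel imports cleanly, swap it in for the stock LlamaRMSNorm; otherwise leave transformers untouched. A sketch assuming flash-attn's `flash_attn.ops.rms_norm` module is built and available:

```python
from transformers.models.llama import modeling_llama

try:
    from flash_attn.ops.rms_norm import RMSNorm as FlashRMSNorm

    class PatchedLlamaRMSNorm(FlashRMSNorm):
        # Match LlamaRMSNorm's (hidden_size, eps) constructor signature.
        def __init__(self, hidden_size, eps=1e-6):
            super().__init__(hidden_size, eps=eps)

    modeling_llama.LlamaRMSNorm = PatchedLlamaRMSNorm
except ImportError:
    pass  # fused kernel unavailable; keep the stock implementation
```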

use flash_attn xentropy when available (#525)
5fe30b1

tmm1 committed
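
The same availability-gated pattern, applied to the fused cross-entropy loss; a sketch assuming `flash_attn.losses.cross_entropy` is installed and that the era's modeling_llama references `CrossEntropyLoss` as a module-level name:

```python
from transformers.models.llama import modeling_llama

try:
    from flash_attn.losses.cross_entropy import CrossEntropyLoss

    # modeling_llama instantiates CrossEntropyLoss by name when computing
    # the LM loss, so rebinding the symbol swaps in the fused kernel.
    modeling_llama.CrossEntropyLoss = CrossEntropyLoss
except ImportError:
    pass  # keep torch.nn.CrossEntropyLoss
```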

fix checkpoints on multigpu (#481)
31f3e71

winglian committed

ReLoRA implementation (with quantization) (#322)
bde3c5a

chargoddard and winglian committed
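
ReLoRA periodically merges the trained low-rank delta into the frozen base weight and restarts the adapter, so a sequence of rank-r updates can compose into a higher-rank change. A conceptual sketch with a plain nn.Linear (not axolotl's peft-based implementation):

```python
import torch
import torch.nn as nn


class ReLoRALinear(nn.Module):
    """Conceptual ReLoRA layer: frozen base weight plus a mergeable low-rank delta."""

    def __init__(self, base: nn.Linear, rank: int = 8):
        super().__init__()
        self.base = base.requires_grad_(False)
        self.lora_a = nn.Linear(base.in_features, rank, bias=False)
        self.lora_b = nn.Linear(rank, base.out_features, bias=False)
        nn.init.zeros_(self.lora_b.weight)  # delta starts at zero

    def forward(self, x):
        return self.base(x) + self.lora_b(self.lora_a(x))

    @torch.no_grad()
    def merge_and_restart(self):
        # Fold the trained low-rank update into the frozen weight, then
        # re-initialise the adapter so the next phase learns a fresh delta.
        self.base.weight += self.lora_b.weight @ self.lora_a.weight
        nn.init.kaiming_uniform_(self.lora_a.weight, a=5**0.5)
        nn.init.zeros_(self.lora_b.weight)
```

The full method also resets optimizer state for the adapter parameters at each restart, which this sketch omits.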

is_causal fix for evals?
fbf49a4

winglian committed

fix evals (#447)
ee26281

winglian committed

standardize attn hijack patches (#381)
06edf17

tmm1 and winglian committed
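
These hijack patches all share one shape: capture the stock forward, then assign a replacement onto the transformers class before the model is built. A minimal sketch of that pattern (the patched body here is a no-op placeholder):

```python
from transformers.models.llama import modeling_llama

_original_forward = modeling_llama.LlamaAttention.forward


def _patched_forward(self, *args, **kwargs):
    # A real patch would run an alternative attention kernel here;
    # delegating keeps behaviour identical when nothing is swapped in.
    return _original_forward(self, *args, **kwargs)


def hijack_llama_attention():
    """Apply the patch; called once, before the model is instantiated."""
    modeling_llama.LlamaAttention.forward = _patched_forward
```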

fix check for flash attn branching (#377)
343ac84

winglian committed

Attention mask and position id fixes for packing (#285)
2bb0b78

winglian committed
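
With sample packing, several samples share one row of the batch, so position ids must restart at every packed boundary (and the attention mask must keep samples from attending to each other). A sketch of rebuilding position ids from per-sample segment ids (the segment-id convention is an assumption, not this commit's exact code):

```python
import torch


def get_packed_position_ids(segment_ids: torch.Tensor) -> torch.Tensor:
    """Position ids that restart at each packed-sample boundary.

    segment_ids: (batch, seq_len) with a distinct integer per packed sample,
    e.g. [1, 1, 1, 2, 2] -> [0, 1, 2, 0, 1].
    """
    position_ids = torch.zeros_like(segment_ids)
    for seg in segment_ids.unique():
        sel = segment_ids == seg
        # cumulative count within each segment, computed per row
        position_ids[sel] = (sel.cumsum(dim=-1) - 1)[sel]
    return position_ids
```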

Update XFormers Attention Monkeypatch to handle Llama-2 70B (GQA) (#339)
10405b9

ssmi153 committed
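
Llama-2 70B uses grouped-query attention: fewer key/value heads than query heads, so the KV tensors must be expanded before a kernel that expects equal head counts. The expansion step, mirroring transformers' repeat_kv helper:

```python
import torch


def repeat_kv(hidden: torch.Tensor, n_rep: int) -> torch.Tensor:
    """Expand (batch, num_kv_heads, seq_len, head_dim) to num_kv_heads * n_rep heads.

    With GQA, each key/value head serves n_rep query heads, so the KV tensor
    is duplicated along the head dimension before the attention kernel runs.
    """
    if n_rep == 1:
        return hidden
    batch, num_kv_heads, seq_len, head_dim = hidden.shape
    hidden = hidden[:, :, None, :, :].expand(batch, num_kv_heads, n_rep, seq_len, head_dim)
    return hidden.reshape(batch, num_kv_heads * n_rep, seq_len, head_dim)
```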

move flash-attn monkey patch alongside the others
312a9fa

tmm1 committed

fix sdp attention to use the flash/mem-efficient context manager
a032c9f

winglian committed
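
The fix routes attention through PyTorch's kernel-selection context manager so the fused flash / memory-efficient paths are preferred over the math fallback. A sketch using the torch 2.x API of that era (newer releases expose torch.nn.attention.sdpa_kernel instead):

```python
import torch
import torch.nn.functional as F

q = k = v = torch.randn(1, 8, 128, 64, device="cuda", dtype=torch.float16)

# Disallow the "math" fallback so SDPA picks a fused kernel.
with torch.backends.cuda.sdp_kernel(
    enable_flash=True, enable_math=False, enable_mem_efficient=True
):
    out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
```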

Fixed pre-commit problems, fixed small bug in logging_config to handle LOG_LEVEL env var
b1f4f7a

theobjectivedad committed
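
The logging piece of this commit reads the log level from the environment; the general pattern looks like this (a minimal sketch, not the repo's logging_config):

```python
import logging
import os

# Fall back to INFO when LOG_LEVEL is unset; upper() tolerates "debug" etc.
logging.basicConfig(level=os.environ.get("LOG_LEVEL", "INFO").upper())
logging.getLogger("axolotl").info("logging configured")
```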

Fix set mem_id for inference and refactor
974dc00

Nanobit committed

Clean up landmark patching
a6190c8

Nanobit committed

Refactor landmark attention patch
919727b

Nanobit committed

add support to extend context with xpos rope
a03a7d7

winglian committed
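
xPos augments rotary position embeddings with a per-dimension exponential decay: queries are scaled up and keys scaled down so their product decays with the distance between them. A sketch of the scale term as given in the xPos paper:

```python
import torch


def xpos_scale(positions: torch.Tensor, head_dim: int, scale_base: float = 512.0):
    """Per-position, per-dimension xPos scaling factors.

    positions: (seq_len,) integer positions. Queries are multiplied by the
    returned scale and keys by its reciprocal, giving a relative decay.
    """
    dim_scale = (torch.arange(0, head_dim, 2) + 0.4 * head_dim) / (1.4 * head_dim)
    power = positions[:, None] / scale_base
    return dim_scale[None, :] ** power  # (seq_len, head_dim // 2)
```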

Fix grad checkpoint and outputs param
2a801b0

Nanobit committed

Feat: Add landmark attention
55b8542

Nanobit committed
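
Landmark attention inserts a learned landmark token after each fixed-size block of the input, letting the model retrieve whole blocks through their landmarks at long context lengths. A sketch of just the token-insertion step (landmark_id and block_size are illustrative; the real patch also adjusts labels and the attention computation):

```python
import torch


def insert_landmarks(input_ids: torch.Tensor, landmark_id: int, block_size: int = 63) -> torch.Tensor:
    """Insert a landmark token after every block of tokens (1-D sequence).

    The trailing partial block also gets a landmark so every block is
    addressable; label and attention-mask handling is omitted here.
    """
    mark = torch.tensor([landmark_id], dtype=input_ids.dtype, device=input_ids.device)
    pieces = []
    for block in input_ids.split(block_size):
        pieces.extend((block, mark))
    return torch.cat(pieces)
```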

don't worry about dupes
c56818b

winglian committed

Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py
1076bcb

winglian and Nanobit committed

Update src/axolotl/monkeypatch/llama_attn_hijack_xformers.py
2daa683

winglian and Nanobit committed

black formatting
ad0ea6a

winglian committed

copy xformers attn from ooba since we removed dep on alpaca_lora_4bit
6cb2310

winglian committed