Llama-3.1-5B-Instruct-18Layers / mergekit_config.yml
dtype: float16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 6]
    model: meta-llama/Meta-Llama-3.1-8B-Instruct
- sources:
  - layer_range: [20, 32]
    model: meta-llama/Meta-Llama-3.1-8B-Instruct
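
This passthrough merge copies layers 0-5 and 20-31 of Meta-Llama-3.1-8B-Instruct (mergekit layer ranges are end-exclusive) into one stack, yielding the 18 decoder layers referenced in the repo name. Below is a minimal sketch of how such a config is typically run, assuming mergekit is installed (pip install mergekit) and this file is saved as mergekit_config.yml; the output directory name is only illustrative.

# Sketch: run this config through mergekit's mergekit-yaml CLI entry point.
# Assumes `pip install mergekit`; output path is an example, not part of the repo.
import subprocess

subprocess.run(
    ["mergekit-yaml", "mergekit_config.yml", "./Llama-3.1-5B-Instruct-18Layers"],
    check=True,  # raise if the merge exits with a non-zero status
)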