LLaMa-3-8B-First-8-Layers / mergekit_config.yml
merge_method: passthrough
dtype: bfloat16
slices:
- sources:
  - layer_range: [0, 8]
    model: NousResearch/Meta-Llama-3-8B
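This config uses mergekit's `passthrough` merge method, which copies the listed layer slices verbatim rather than averaging weights: here, the first eight transformer layers of `NousResearch/Meta-Llama-3-8B` in bfloat16. As a minimal sketch of how the config is structured, the snippet below parses it with PyYAML and reads the slice bounds (the half-open `[start, end)` interpretation of `layer_range` is an assumption about mergekit's semantics):

```python
import yaml

# The mergekit config from this repo, embedded as a string for illustration.
CONFIG_TEXT = """\
merge_method: passthrough
dtype: bfloat16
slices:
- sources:
  - layer_range: [0, 8]
    model: NousResearch/Meta-Llama-3-8B
"""

config = yaml.safe_load(CONFIG_TEXT)

source = config["slices"][0]["sources"][0]
start, end = source["layer_range"]
# Assuming layer_range is half-open, [0, 8] selects layers 0 through 7.
num_layers = end - start

print(config["merge_method"])  # passthrough
print(source["model"])         # NousResearch/Meta-Llama-3-8B
print(num_layers)              # 8
```

The resulting model would be a truncated Llama 3 containing only those eight layers, which is typically useful for analysis or as a building block for frankenmerges rather than as a standalone chat model.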