LLaMA2-Accessory / config/code_13B_params.json
{
"dim": 5120,
"n_layers": 40,
"n_heads": 40,
"multiple_of": 256,
"ffn_dim_multiplier": 1.0,
"norm_eps": 1e-5,
"rope_theta": 1000000
}
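A minimal sketch of how such a parameter file can be consumed, assuming a hypothetical `ModelArgs` dataclass whose fields mirror the JSON keys (the actual class in LLaMA2-Accessory may differ). The derived sizes follow the public LLaMA reference computation: per-head dimension is `dim / n_heads`, and the SwiGLU hidden size is `2/3 * 4 * dim`, scaled by `ffn_dim_multiplier` and rounded up to a multiple of `multiple_of`.

```python
import json
from dataclasses import dataclass

# Hypothetical dataclass mirroring the keys in code_13B_params.json;
# defaults are placeholders and are overridden by the file contents.
@dataclass
class ModelArgs:
    dim: int = 4096
    n_layers: int = 32
    n_heads: int = 32
    multiple_of: int = 256
    ffn_dim_multiplier: float = 1.0
    norm_eps: float = 1e-5
    rope_theta: float = 10000.0

# Inline copy of the config; in practice: json.load(open("code_13B_params.json"))
params = json.loads("""
{
    "dim": 5120,
    "n_layers": 40,
    "n_heads": 40,
    "multiple_of": 256,
    "ffn_dim_multiplier": 1.0,
    "norm_eps": 1e-5,
    "rope_theta": 1000000
}
""")
args = ModelArgs(**params)

# Per-head dimension: 5120 / 40 = 128.
head_dim = args.dim // args.n_heads

# SwiGLU hidden size, rounded up to the nearest multiple of 256
# (LLaMA reference formula): 2/3 * 4 * 5120 = 13653.33 -> 13824.
hidden_dim = int(2 * (4 * args.dim) / 3 * args.ffn_dim_multiplier)
hidden_dim = args.multiple_of * (
    (hidden_dim + args.multiple_of - 1) // args.multiple_of
)

print(head_dim, hidden_dim)  # 128 13824
```

The high `rope_theta` of 1000000 (versus the usual 10000) is the RoPE base frequency typically raised for long-context variants.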