marcsun13 committed
Commit
625bae0
1 Parent(s): a8d73c9

Update config.json


rope_theta should not be in the config file; it is an argument used by the Llama model in MLX.
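For context, a minimal sketch of what passing rope_theta as a model argument (rather than a config.json field) might look like. The module shape, names, and the 10000.0 default are illustrative assumptions, not this repo's actual code; only mlx.nn.RoPE's `base` parameter is taken from the MLX API:

```python
import mlx.core as mx
import mlx.nn as nn


class Attention(nn.Module):
    # Illustrative stub: rope_theta arrives as a constructor argument
    # with a default (value here is an assumption; the right base
    # depends on the model variant), not as a key read from config.json.
    def __init__(self, dims: int, n_heads: int, rope_theta: float = 10000.0):
        super().__init__()
        head_dim = dims // n_heads
        # mlx.nn.RoPE exposes the base frequency via its `base` parameter.
        self.rope = nn.RoPE(head_dim, traditional=True, base=rope_theta)

    def __call__(self, x: mx.array) -> mx.array:
        # Apply rotary embeddings (to queries/keys in a real model).
        return self.rope(x)
```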

Files changed (1): config.json (+0, -1)
config.json CHANGED
@@ -5,7 +5,6 @@
     "hidden_dim": 14336,
     "n_heads": 32,
     "n_kv_heads": 8,
-    "rope_theta": 1000000.0,
     "norm_eps": 1e-05,
     "vocab_size": 32000,
     "quantization": {
 