Support Mosaic's MPT models

#3
by tomaarsen - opened

Hello!

This leaderboard is looking great! Personally, I'd love to see Mosaic's new MPT models on there, e.g. mpt-7b-instruct. However, these models require `trust_remote_code=True`.

In particular, this goes wrong in the `is_model_on_hub` function, which returns `False`:
https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard/blob/main/app.py#L136-L144

It does so because the following exception is raised and then caught by the `except` clause:

ValueError: Loading mosaicml/mpt-7b-instruct requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option `trust_remote_code=True` to remove this error.
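
For context, the same failure can be reproduced outside the leaderboard with a plain config load (a minimal sketch, assuming a recent `transformers` release):

```python
from transformers import AutoConfig

# Without trust_remote_code=True this raises the ValueError quoted above,
# which is_model_on_hub then catches, so the model is reported as not on the Hub.
AutoConfig.from_pretrained("mosaicml/mpt-7b-instruct")
```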

Perhaps it would be possible to hardcode an exception for this trustworthy model.
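
One way that could look, as a rough sketch (the allowlist constant and the exact function body here are illustrative, not the leaderboard's actual code):

```python
from transformers import AutoConfig

# Hypothetical allowlist of models whose remote code has been reviewed by hand.
TRUSTED_REMOTE_CODE_MODELS = {"mosaicml/mpt-7b", "mosaicml/mpt-7b-instruct"}

def is_model_on_hub(model_name: str, revision: str = "main") -> bool:
    try:
        AutoConfig.from_pretrained(
            model_name,
            revision=revision,
            # Only enable remote code execution for models vetted manually.
            trust_remote_code=model_name in TRUSTED_REMOTE_CODE_MODELS,
        )
        return True
    except Exception:
        return False
```

That would keep the default behaviour (no remote code execution) for every other repo while letting manually reviewed models through.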

  • Tom Aarsen
Open LLM Leaderboard org

The mosaicml/mpt-7b model is on the leaderboard now.

thomwolf changed discussion status to closed
Open LLM Leaderboard org

Hi @tomaarsen !
We now support all MPT models, feel free to add as many as you want :)

Awesome, thank you!
