---
library_name: transformers
base_model:
- NousResearch/Hermes-2-Pro-Mistral-7B
datasets:
- jondurbin/gutenberg-dpo-v0.1
- nbeerbower/gutenberg2-dpo
license: apache-2.0
---
![image/png](https://huggingface.co/nbeerbower/Mistral-Small-Gutenberg-Doppel-22B/resolve/main/doppel-header?download=true)
# Hermes2-Gutenberg2-Mistral-7B
[NousResearch/Hermes-2-Pro-Mistral-7B](https://huggingface.co/NousResearch/Hermes-2-Pro-Mistral-7B) finetuned on [jondurbin/gutenberg-dpo-v0.1](https://huggingface.co/datasets/jondurbin/gutenberg-dpo-v0.1) and [nbeerbower/gutenberg2-dpo](https://huggingface.co/datasets/nbeerbower/gutenberg2-dpo).
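
A minimal inference sketch with `transformers`; the repo id below is assumed from this card's title, and the base model's ChatML chat template is applied via the tokenizer:

```python
# Hypothetical usage example; model_id is assumed from the card title.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nbeerbower/Hermes2-Gutenberg2-Mistral-7B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [
    {"role": "user", "content": "Write the opening paragraph of a gothic short story."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```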
### Method
[ORPO tuned](https://mlabonne.github.io/blog/posts/2024-04-19_Fine_tune_Llama_3_with_ORPO.html) on 2x RTX 3090 for 3 epochs.
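
A sketch of the ORPO setup using TRL's `ORPOTrainer`, assuming the two Gutenberg preference datasets' `prompt`/`chosen`/`rejected` columns; the hyperparameters are illustrative (only the 3-epoch count comes from this card), not the exact settings used for this model:

```python
# Illustrative ORPO training sketch; hyperparameters are assumptions,
# not the recipe actually used for Hermes2-Gutenberg2-Mistral-7B.
from datasets import load_dataset, concatenate_datasets
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import ORPOConfig, ORPOTrainer

base = "NousResearch/Hermes-2-Pro-Mistral-7B"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base)

# Combine the two Gutenberg preference datasets used for this finetune.
cols = ["prompt", "chosen", "rejected"]
ds = concatenate_datasets([
    load_dataset("jondurbin/gutenberg-dpo-v0.1", split="train").select_columns(cols),
    load_dataset("nbeerbower/gutenberg2-dpo", split="train").select_columns(cols),
])

config = ORPOConfig(
    output_dir="hermes2-gutenberg2-orpo",
    num_train_epochs=3,             # matches the 3 epochs reported above
    per_device_train_batch_size=1,  # illustrative
    gradient_accumulation_steps=8,  # illustrative
    learning_rate=8e-6,             # illustrative
    beta=0.1,                       # ORPO odds-ratio weight; illustrative
    max_length=2048,
    max_prompt_length=1024,
    bf16=True,
)

trainer = ORPOTrainer(
    model=model,
    args=config,
    train_dataset=ds,
    processing_class=tokenizer,  # use tokenizer= on older TRL versions
)
trainer.train()
```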