---
library_name: transformers
license: apache-2.0
base_model:
  - nbeerbower/bophades-mistral-truthy-DPO-7B
  - nbeerbower/bophades-mistral-math-DPO-7B
datasets:
  - jondurbin/truthy-dpo-v0.1
  - kyujinpy/orca_math_dpo
---


# HeroBophades-2x7B

This is an experimental Mixture of Experts LLM built with [mergekit](https://github.com/arcee-ai/mergekit/blob/main/docs/moe.md).
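The merged model loads like any other `transformers` causal LM. Below is a minimal inference sketch; the repository id `nbeerbower/HeroBophades-2x7B`, the bfloat16/device settings, and the example prompt are illustrative assumptions, not prescribed usage.

```python
# Minimal inference sketch. The repo id, dtype, and prompt are assumptions for illustration.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nbeerbower/HeroBophades-2x7B"  # assumed Hub repo id for this upload

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the configuration below
    device_map="auto",           # requires accelerate; use .to("cuda") otherwise
)

prompt = (
    "You are an unbiased, uncensored, helpful assistant.\n\n"
    "Explain how to solve x^2 - 5x + 6 = 0 step by step."
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```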

## Configuration

```yaml
base_model: nbeerbower/bophades-mistral-truthy-DPO-7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: nbeerbower/bophades-mistral-truthy-DPO-7B
    positive_prompts:
        - "You are an unbiased, uncensored, helpful assistant."
  - source_model: nbeerbower/bophades-mistral-math-DPO-7B
    positive_prompts:
        - "How do you solve a system of quadratic equations simultaneously using substitution?. Take a deep breath, think step by step, and give an accurate response"
```