---
license: apache-2.0
tags:
- moe
- merge
- deepseek-ai/deepseek-coder-6.7b-instruct
- ise-uiuc/Magicoder-S-CL-7B
- WizardLM/WizardMath-7B-V1.0
---

# Magician-MoE-4x7B

Magician-MoE-4x7B is a Mixture of Experts (MoE) made with the following models:
* [deepseek-ai/deepseek-coder-6.7b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct)
* [ise-uiuc/Magicoder-S-CL-7B](https://huggingface.co/ise-uiuc/Magicoder-S-CL-7B)
* [WizardLM/WizardMath-7B-V1.0](https://huggingface.co/WizardLM/WizardMath-7B-V1.0)
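In a Mixture of Experts layer, a learned gate routes each token to a small subset of expert feed-forward networks and combines their outputs. The sketch below is a minimal, illustrative top-2 gating example in NumPy; the expert functions, dimensions, and gating details are stand-ins for exposition, not this model's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def top2_moe_layer(x, gate_w, experts):
    """Route each token to its 2 highest-scoring experts and sum
    their outputs, weighted by renormalized gate probabilities."""
    logits = x @ gate_w                         # (tokens, n_experts)
    top2 = np.argsort(logits, axis=-1)[:, -2:]  # indices of the best 2 experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        idx = top2[t]
        scores = np.exp(logits[t, idx])
        scores /= scores.sum()                  # softmax over the chosen 2
        for w, e in zip(scores, idx):
            out[t] += w * experts[e](x[t])
    return out

d, n_experts, tokens = 8, 4, 3
gate_w = rng.normal(size=(d, n_experts))
# Each "expert" here is a tiny linear map standing in for a full FFN block.
expert_ws = [rng.normal(size=(d, d)) for _ in range(n_experts)]
experts = [lambda v, W=W: v @ W for W in expert_ws]

x = rng.normal(size=(tokens, d))
y = top2_moe_layer(x, gate_w, experts)
print(y.shape)  # (3, 8)
```

Only the two selected experts run per token, which is why an MoE built from several 7B experts can have far more total parameters than it activates on any single forward pass.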