---
license: other
language:
- en
tags:
- rp
- erp
- chat
- storywriting
- not-for-all-audiences
- nsfw
- rotating-stack-merge
---
# La Dame Blanche 103b

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/65a531bc7ec6af0f95c707b1/aO6n3uOtyUllpwefLMapa.jpeg)

This model is a rotating-stack merge of three Miqu-based models in a 103b (120 layer) configuration. The result of this "frankenmerge" is a large model that contains what I consider to be the best of the spicy Miqu finetunes.

Component models for the rotating stack are:
- NeverSleep/MiquMaid-v2-70B-DPO
- sophosympatheia/Midnight-Miqu-70B-v1.5
- cookinai/OrcaHermes-Mistral-70B-miqu

As all components are Miqu-based, coherence out to 32k context appears to be good.

This model is uncensored and perfectly capable of generating objectionable material. It does not seem to have a propensity to insert NSFW content into SFW prompts, but YMMV. As with any LLM, no factual claims made by the model should be taken at face value. You know that boilerplate safety disclaimer that most professional models have? Assume this has it too. This model is for entertainment purposes only.

Vanilla GGUFs:
iMatrix GGUFs:

If you create additional quants, please let me know and I will link them here as well.

# Sample output

```
{{[INPUT]}}
Write a detailed and humorous story about a cute and fluffy bunny that goes to a Gwar concert.
{{[OUTPUT]}}

```

# WTF is a rotating-stack merge?
Inspired by Undi's experiments with stacked merges, Jeb Carter found that output quality and model initiative could be significantly improved by reversing the model order in the stack, and then doing a linear merge between the original and reversed stacks. That is what I did here: I created three passthrough stacked merges from the three source models (rotating the model order in each stack), and then did a linear merge of all three stacks. The exact merge configs can be found in the recipe.txt file.
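
To make the structure concrete, here is a minimal mergekit-style sketch of the approach. This is an illustration only, not the actual recipe: the slice boundaries, the `./stack-*` paths, and the dtype are assumptions; see recipe.txt for the real configs.

```yaml
# Hypothetical sketch of ONE of the three passthrough stacks ("stack-a").
# Each 70b Miqu component has 80 layers; three overlapping 40-layer slices
# give the 120-layer (103b) configuration described above.
slices:
  - sources:
      - model: NeverSleep/MiquMaid-v2-70B-DPO
        layer_range: [0, 40]
  - sources:
      - model: sophosympatheia/Midnight-Miqu-70B-v1.5
        layer_range: [20, 60]
  - sources:
      - model: cookinai/OrcaHermes-Mistral-70B-miqu
        layer_range: [40, 80]
merge_method: passthrough
dtype: float16
```

The other two stacks would use the same slices with the model order rotated (B/C/A and C/A/B), and the three resulting 120-layer stacks would then be averaged:

```yaml
# Hypothetical final step: equal-weight linear merge of the three stacks.
# The ./stack-* paths are placeholders for the three passthrough outputs.
models:
  - model: ./stack-a
    parameters:
      weight: 1.0
  - model: ./stack-b
    parameters:
      weight: 1.0
  - model: ./stack-c
    parameters:
      weight: 1.0
merge_method: linear
dtype: float16
```

Because every layer position is covered by each source model in one of the three rotations, the linear merge averages all three models at every depth instead of leaving any one model dominant in a fixed region of the stack.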