---
license: other
language:
- en
tags:
- rp
- erp
- chat
- storywriting
- not-for-all-audiences
- nsfw
- rotating-stack-merge
---
# La Dame Blanche 103b

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/65a531bc7ec6af0f95c707b1/aO6n3uOtyUllpwefLMapa.jpeg)

This model is a rotating-stack merge of three Miqu-based models in a 103b (120 layer) configuration.  The result of 
this "frankenmerge" is a large model that contains what I consider to be the best of the spicy Miqu finetunes.

Component models for the rotating stack are:
- NeverSleep/MiquMaid-v2-70B-DPO
- sophosympatheia/Midnight-Miqu-70B-v1.5
- cookinai/OrcaHermes-Mistral-70B-miqu
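
Each stack interleaves layer slices from the three 70B (80-layer) components into a single 120-layer model. Below is a minimal mergekit-style sketch of what one such passthrough stack could look like; the layer ranges and the model ordering here are illustrative assumptions only, not the actual values from recipe.txt.

```yaml
# Illustrative passthrough stack (one of three rotations) -- NOT the actual recipe
merge_method: passthrough
dtype: float16
slices:
  - sources:
      - model: NeverSleep/MiquMaid-v2-70B-DPO
        layer_range: [0, 40]       # assumed slice boundaries
  - sources:
      - model: sophosympatheia/Midnight-Miqu-70B-v1.5
        layer_range: [20, 60]      # assumed slice boundaries
  - sources:
      - model: cookinai/OrcaHermes-Mistral-70B-miqu
        layer_range: [40, 80]      # assumed slice boundaries
```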


As all components are Miqu-based, coherency out to 32k context appears to be good.

This model is uncensored and perfectly capable of generating objectionable material.  It does not seem to have a 
propensity to insert NSFW content into SFW prompts, but YMMV.  As with any LLM, no factual claims 
made by the model should be taken at face value.  You know that boilerplate safety disclaimer that most professional models have?  
Assume this has it too.  This model is for entertainment purposes only. 

Vanilla GGUFs:  
iMatrix GGUFs:

If you create additional quants, please let me know and I will link them here as well.


# Sample output

```
{{[INPUT]}}
Write a detailed and humorous story about a cute and fluffy bunny that goes to a Gwar concert.
{{[OUTPUT]}}

```


# WTF is a rotating-stack merge?
Inspired by Undi's experiments with stacked merges, Jeb Carter found that output quality and model initiative could be significantly 
improved by reversing the model order in the stack, and then doing a linear merge between the original and reversed stacks.  That is 
what I did here.  I created three passthrough stacked merges using the three source models (rotating the model order in each stack), 
and then performed a linear merge of all three stacks.  The exact merge configs can be found in the recipe.txt file.
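
As a rough sketch of that final step, the linear merge of the three intermediate stacks might look like the config below. The `stack-a`/`stack-b`/`stack-c` paths are placeholders for the three passthrough merges (each built with a rotated model order), and the equal weights are an assumption rather than the values in recipe.txt.

```yaml
# Illustrative final step: equal-weight linear merge of the three rotated stacks
# "./stack-a", "./stack-b", "./stack-c" are placeholder paths to the intermediate merges
merge_method: linear
dtype: float16
models:
  - model: ./stack-a
    parameters:
      weight: 1.0
  - model: ./stack-b
    parameters:
      weight: 1.0
  - model: ./stack-c
    parameters:
      weight: 1.0
```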