---
license: apache-2.0
language:
- en
tags:
- merge
---

![image/png](https://i.ibb.co/Qr4BYgc/1.png)

Test merge. An attempt at a model that is good at RP, ERP, and general tasks with 128k context. Every model here uses `Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context` in the merge instead of the regular MistralYarn 128k. The reason is that I believe Epiculous merged it with Mistral Instruct v0.2 to make the first 32k of context as good as possible; if not, that's a shame.

Here is the "family tree" of this model. I'm not writing out the full model names because they are long.

### NeuralKunoichi-EroSumika 4x7B
```
* NeuralKunoichi-EroSumika 4x7B
  *(1) Kunocchini-7b-128k
  |
  *(2) Mistral-Instruct-v0.2-128k
      * Mistral-7B-Instruct-v0.2
      |
      * Fett-128k
  |
  *(3) Erosumika-128k
      * Erosumika 7B
      |
      * Fett-128k
  |
  *(4) Mistral-NeuralHuman-128k
      * Fett-128k
      |
      * Mistral-NeuralHuman
          * Mistral_MoreHuman
          |
          * Mistral-Neural-Story
```
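
A 4x7B mixture like this is typically assembled with `mergekit`'s MoE mode, where each of the four branches above becomes one expert. A minimal config sketch follows — the model paths and routing prompts here are placeholders for illustration, not the actual recipe used for this merge:

```yaml
# Hypothetical mergekit-moe config sketch. Model paths and prompts are
# placeholders, NOT the author's actual merge recipe.
base_model: path/to/Mistral-Instruct-v0.2-128k   # placeholder base
gate_mode: hidden        # route tokens by hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: path/to/Kunocchini-7b-128k     # expert (1), placeholder
    positive_prompts:
      - "roleplay"
  - source_model: path/to/Mistral-Instruct-v0.2-128k  # expert (2), placeholder
    positive_prompts:
      - "general assistance"
  - source_model: path/to/Erosumika-128k         # expert (3), placeholder
    positive_prompts:
      - "erotic roleplay"
  - source_model: path/to/Mistral-NeuralHuman-128k    # expert (4), placeholder
    positive_prompts:
      - "creative writing"
```

With `gate_mode: hidden`, mergekit initializes the router from each expert's hidden-state response to its positive prompts, which is why prompt choice matters for how tokens get routed.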