---
inference: false
language:
- en
- de
- fr
- it
- es
- id
- ar
- ko
- no
- ru
- hi
- zh
library_name: transformers
license: apache-2.0
model_creator: LHC88
model_name: XPurpose-ClownCar-v0
model_type: Mixtral MoE
pipeline_tag: text-generation
prompt_template: '<|im_start|>system

  {system_message}<|im_end|>

  <|im_start|>user

  {prompt}<|im_end|>

  <|im_start|>assistant

  '
quantized_by: LHC
tags:
- mistral
- finetune
- dpo
- multi-language
- multi-purpose
- MoE
- Mixture-of-Experts
- mixtral
---
<!-- markdownlint-disable MD041 -->

<!-- header start -->
<!-- 200823 -->
<div style="display: flex; justify-content: space-between; width: 100%;">
    <div style="display: flex; flex-direction: column; align-items: flex-start;">
        <p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.linkedin.com/in/lucas-h%C3%A4nke-de-cansino-8b8521234/">Chat & support: LHC's LinkedIn</a></p>
    </div>
    <div style="display: flex; flex-direction: column; align-items: flex-end;">
        <p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://github.com/sponsors/l4b4r4b4b4">Want to contribute? LHC's Github Sponsors</a></p>
    </div>
</div>

<hr style="margin-top: 1.0em; margin-bottom: 1.0em;">
<!-- header end -->

<!-- description start -->
XPurpose-ClownCar-v0 is a multi-purpose Mixture-of-Experts (MoE) model merged from the expert configuration below. Each expert's positive prompts (and, where given, negative prompts) steer the router toward or away from that expert, and `gate_mode: hidden` initializes the router gates from hidden-state representations of those prompts.

```yaml
base_model: openaccess-ai-collective/DPOpenHermes-7B
dtype: bfloat16
experts:
- positive_prompts:
  - instruction
  - solutions
  - chat
  - questions
  - comprehension
  source_model: teknium/OpenHermes-2.5-Mistral-7B
- negative_prompts:
  - chat
  - questions
  - python
  positive_prompts:
  - coding
  - programming
  - code
  - programming language
  source_model: codellama/CodeLlama-13b-hf
- negative_prompts:
  - chat
  - questions
  positive_prompts:
  - python
  - pip
  - coding
  - programming
  - code
  - programming language
  source_model: codellama/CodeLlama-13b-Python-hf
- negative_prompts:
  - chat
  - questions
  positive_prompts:
  - mathematics
  - optimization
  - step-by-step
  - science
  source_model: cognitivecomputations/dolphin-2.6-mistral-7b-dpo
- negative_prompts:
  - chat
  - questions
  positive_prompts:
  - bedtime story
  - Once upon a time
  - storytelling
  - narrator
  source_model: tom92119/llama-2-7b-bedtime-story
- negative_prompts:
  - chat
  - questions
  positive_prompts:
  - story
  - Once upon a time
  - storytelling
  - narrator
  source_model: Norquinal/Mistral-7B-storywriter
- negative_prompts:
  - chat
  - questions
  - instruction
  - solutions
  - comprehension
  - mathematics
  - optimization
  - code
  - step-by-step
  - science
  positive_prompts:
  - function calls
  - functions
  - constrained grammar
  - API calls
  - LLM Tools
  source_model: meetkai/functionary-small-v2.2
- positive_prompts:
  - indonesian
  - indonesia
  source_model: azale-ai/Starstreak-7b-beta
- positive_prompts:
  - arabic
  - arab
  source_model: gagan3012/Mistral_arabic_dpo
- positive_prompts:
  - korean
  - korea
  source_model: davidkim205/komt-mistral-7b-v1
- positive_prompts:
  - chinese
  - china
  source_model: OpenBuddy/openbuddy-zephyr-7b-v14.1
- positive_prompts:
  - hindi
  - india
  source_model: manishiitg/open-aditi-hi-v1
- positive_prompts:
  - german
  - deutsch
  - Germany
  source_model: VAGOsolutions/SauerkrautLM-7b-v1-mistral
- positive_prompts:
  - Norway
  - Norwegian
  - Norsk
  source_model: bineric/NorskGPT-Mistral-7b
- positive_prompts:
  - Russian
  - Russia
  - "\u0420\u0443\u0441\u0441\u043A\u0438\u0439"
  - "\u0420\u043E\u0441\u0441\u0438\u044F"
  source_model: Droidfanat/llama-2-7b-custom-russian
gate_mode: hidden
```
<!-- description end -->
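
## Usage example

The YAML above is in the format consumed by mergekit's MoE merge script, and the resulting model loads like any other `transformers` causal LM. Below is a minimal sketch, assuming the repository id `LHC88/XPurpose-ClownCar-v0` (inferred from this card, not confirmed) and enough GPU memory for bfloat16 weights.

```python
# Minimal loading/generation sketch. The repo id and device settings are
# assumptions; adjust them to your setup.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LHC88/XPurpose-ClownCar-v0"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",
)

# Prompt in the ChatML format documented below.
prompt = (
    "<|im_start|>system\nYou are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\nWrite a short poem about clown cars.<|im_end|>\n"
    "<|im_start|>assistant\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```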

<!-- prompt-template start -->
## Prompt template: ChatML

```
<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant

```

<!-- prompt-template end -->
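
Rather than assembling the ChatML string by hand, you can render it with the tokenizer's chat template. This is a sketch that assumes the repository's tokenizer ships a ChatML template; if it does not, format the literal template above yourself.

```python
# Sketch: build the ChatML prompt via the tokenizer's chat template.
# Assumes the tokenizer defines a ChatML template (an assumption, not
# confirmed by this card).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("LHC88/XPurpose-ClownCar-v0")  # assumed repo id

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain what a Mixture-of-Experts model is."},
]

# add_generation_prompt=True appends the trailing "<|im_start|>assistant\n"
# so the model continues as the assistant.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```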