---
license: apache-2.0
base_model: cognitivecomputations/dolphin-2.9.3-mistral-7B-32K
tags:
- generated_from_trainer
- axolotl
datasets:
- cognitivecomputations/Dolphin-2.9
- teknium/OpenHermes-2.5
- m-a-p/CodeFeedback-Filtered-Instruction
- cognitivecomputations/dolphin-coder
- cognitivecomputations/samantha-data
- microsoft/orca-math-word-problems-200k
- Locutusque/function-calling-chatml
- internlm/Agent-FLAN
---

_Original Model Card_

# Dolphin 2.9.3 Mistral 7b v0.3 32k 🐬

Curated and trained by Eric Hartford and Cognitive Computations

[![Discord](https://img.shields.io/discord/1156064224225808488?logo=Discord&logoColor=%23ffffff&label=Discord&link=https%3A%2F%2Fdiscord.gg%2FtCMkMDDHwm)](https://discord.gg/cognitivecomputations)
Discord: https://discord.gg/cognitivecomputations

<img src="https://cdn-uploads.huggingface.co/production/uploads/63111b2d88942700629f5771/ldkN1J0WIDQwU4vutGYiD.png" width="600" />

This model is based on mistralai/Mistral-7B-v0.3 and is governed by the Apache 2.0 license.

The base model has a 32k context window; our finetuning was performed with a sequence length of 8192.

Dolphin 2.9.3 uses the ChatML prompt template format.

Example:

```
<|im_start|>system
You are Dolphin, a helpful AI assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant

```
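
For illustration, a prompt in this format can also be sent verbatim through Ollama's `/api/generate` endpoint with `raw` mode enabled, which bypasses the server-side template. This is a minimal sketch, assuming an Ollama server running locally on the default port (11434) with the model already pulled; the prompt text is a made-up example:

```bash
# Hypothetical example: send a raw ChatML prompt to a local Ollama server.
# "raw": true tells Ollama not to apply its own prompt template.
curl http://localhost:11434/api/generate -d '{
  "model": "CognitiveComputations/dolphin-mistral-32k:7b-v2.9.3-q4_0",
  "raw": true,
  "stream": false,
  "prompt": "<|im_start|>system\nYou are Dolphin, a helpful AI assistant.<|im_end|>\n<|im_start|>user\nWrite a haiku about the ocean.<|im_end|>\n<|im_start|>assistant\n"
}'
```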

## Usage

```bash
ollama run CognitiveComputations/dolphin-mistral-32k:7b-v2.9.3-q4_0
```
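
Since the registry tag is expected to ship with the ChatML template, the model can also be queried through Ollama's chat API with plain role/content messages. A minimal sketch, again assuming a local Ollama server on the default port; the message contents are illustrative:

```bash
# Hypothetical example: chat with the model via Ollama's /api/chat endpoint.
# Ollama applies the model's prompt template to the messages automatically.
curl http://localhost:11434/api/chat -d '{
  "model": "CognitiveComputations/dolphin-mistral-32k:7b-v2.9.3-q4_0",
  "stream": false,
  "messages": [
    {"role": "system", "content": "You are Dolphin, a helpful AI assistant."},
    {"role": "user", "content": "Summarize the Apache 2.0 license in one sentence."}
  ]
}'
```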

## Supported Tags

+ dolphin-mistral-32k:7b-v2.9.3-q2_k
+ dolphin-mistral-32k:7b-v2.9.3-q3_k
+ dolphin-mistral-32k:7b-v2.9.3-q4_0
+ dolphin-mistral-32k:7b-v2.9.3-q4_k_m
+ dolphin-mistral-32k:7b-v2.9.3-q4_k_s
+ dolphin-mistral-32k:7b-v2.9.3-q5_0
+ dolphin-mistral-32k:7b-v2.9.3-q5_k_m
+ dolphin-mistral-32k:7b-v2.9.3-q5_k_s
+ dolphin-mistral-32k:7b-v2.9.3-q6_k
+ dolphin-mistral-32k:7b-v2.9.3-q8_0
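
Any of the tags above can be substituted into the same command. For example, to pull and run the q5_k_m quantization:

```bash
ollama pull CognitiveComputations/dolphin-mistral-32k:7b-v2.9.3-q5_k_m
ollama run CognitiveComputations/dolphin-mistral-32k:7b-v2.9.3-q5_k_m
```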