---
license: mit
datasets:
- sinarashidi/alpaca-persian
language:
- en
- fa
library_name: transformers
---

# Maral 7B Alpha 1

<p align="center">
  <img src="maral-7b-announce.png" width=256 height=256 />
</p>

## What is Maral?

_Maral_ is a new large language model specializing in the Persian language. It is based on [Mistral](https://huggingface.co./mistralai/Mistral-7B-v0.1) and trained on the _Alpaca Persian_ dataset. This model is one of the few efforts in the Persian-speaking scene to bring our language to new life in the era of AI.

Also, since Maral is based on Mistral, it's capable of producing English answers as well. 

### What does "Maral" mean?

Maral is the Persian name of the [Red Deer](https://en.wikipedia.org/wiki/Red_deer), a species of deer native to Iran. The name was chosen for a couple of reasons: first, to reflect our environmental concerns, and second, because a Persian LLM made by Iranian people deserves an Iranian name.

## Inference

### Prompt Format

This model requires the _Guanaco_ prompt format, which looks like this:

```
### Human: <prompt>
### Assistant: <answer>
```

So in your code, you may write prompts like this:

```python
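# The question below means: "Who was the president of the United States in 1996?"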
prompt = "در سال ۱۹۹۶ چه کسی رییس جمهور آمریکا بود؟"
prompt = f"### Human:{prompt}\n### Assistant:"
```

More information about this can be found in the inference sections below. 

### 4-bit Quantization

If you want to use 4-bit quantization, we provide a PEFT adapter [here](https://huggingface.co./MaralGPT/MaralGPT-Mistral-7B-v-0-1). You can also find _Google Colab_ notebooks [here](https://github.com/prp-e/maralgpt).
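
As a rough sketch (the Colab notebooks linked above show the exact setup), the adapter can be attached to a 4-bit quantized Mistral base model using `peft` and `bitsandbytes`; the repository ids below are taken from the links above:

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

base_model_id = "mistralai/Mistral-7B-v0.1"
adapter_id = "MaralGPT/MaralGPT-Mistral-7B-v-0-1"  # the PEFT repository linked above

# NF4 4-bit quantization keeps the 7B base model at roughly 5-6 GB of VRAM
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(base_model_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# Attach the Maral PEFT adapter on top of the quantized base model
model = PeftModel.from_pretrained(base_model, adapter_id)
```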

### Inference on a big GPU
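
A minimal sketch of half-precision inference on a single large GPU (around 16 GB of VRAM or more), assuming `transformers`, `torch` and `accelerate` are installed. The repository id below is an assumption; replace it with the id of this model card if it differs:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MaralGPT/Maral-7B-alpha-1"  # assumed repository id, adjust if needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision, ~15 GB of VRAM for a 7B model
    device_map="auto",           # requires the accelerate package
)

# Build the prompt in the Guanaco format described above
question = "در سال ۱۹۹۶ چه کسی رییس جمهور آمریکا بود؟"
prompt = f"### Human:{question}\n### Assistant:"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```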

### Inference on a small GPU (Consumer Hardware/Free Colab)
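
On consumer hardware or the free Colab tier, the checkpoint can be loaded in 4-bit via `bitsandbytes`. This sketch uses the same repository-id assumption as above:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "MaralGPT/Maral-7B-alpha-1"  # assumed repository id, adjust if needed

# NF4 4-bit quantization keeps the 7B model at roughly 5-6 GB of VRAM
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.float16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

prompt = "### Human:در سال ۱۹۹۶ چه کسی رییس جمهور آمریکا بود؟\n### Assistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```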

## Known Issues

## Special Thanks