---
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- gemma
- trl
base_model: unsloth/gemma-1.1-2b-it-bnb-4bit
---

# Model Details:

- This model was created by finetuning the [unsloth/gemma-1.1-2b-it-bnb-4bit](https://huggingface.co./unsloth/gemma-1.1-2b-it-bnb-4bit) model using the [coedit dataset](https://huggingface.co./datasets/grammarly/coedit) from Grammarly.
- The finetuning followed the fine-tuning notebook provided by Unsloth, as a practice run at finetuning on the coedit dataset.
- The model was finetuned using the prompt format of the gemma-2b-it model.
```
<start_of_turn>user
Fix grammar in this sentence: A notable number of Chinese factories make piratical products by copying foreign products.<end_of_turn>
<start_of_turn>model
A notable number of Chinese factories make pirated products by copying foreign products.<end_of_turn>
```
- The finetuning ran 2x faster by using Unsloth together with Hugging Face's TRL library.
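As a minimal sketch of how the prompt format above is assembled in practice (the helper name below is illustrative, not part of this repo), a coedit-style instruction can be wrapped into a gemma chat prompt before being passed to the tokenizer:

```python
def build_gemma_prompt(instruction: str) -> str:
    """Wrap a coedit-style instruction in the gemma chat format used
    during finetuning: a user turn, then an open model turn for the
    model to complete."""
    return (
        "<start_of_turn>user\n"
        f"{instruction}<end_of_turn>\n"
        "<start_of_turn>model\n"
    )

prompt = build_gemma_prompt(
    "Fix grammar in this sentence: A notable number of Chinese "
    "factories make piratical products by copying foreign products."
)
print(prompt)
```

The resulting string is what you would tokenize and feed to `model.generate`; the model's reply ends with its own `<end_of_turn>` token, as in the example above.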

# Limitations:
The model was finetuned on a specific dataset (coedit) and may not generalize well to all text-editing tasks. Its performance may be limited compared to models trained on larger and more diverse datasets.

# Uploaded model

- **Developed by:** gnokit
- **License:** apache-2.0
- **Finetuned from model:** unsloth/gemma-1.1-2b-it-bnb-4bit

This gemma model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.

[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)