Update README.md
README.md
@@ -15,8 +15,8 @@ This model is an extended version of a Mistral-based Large Language Model (LLM)
 
 - **Base Model**: Mistral 7B based LLM
 - **Tokenizer Extension**: Specifically extended for Turkish
-- **Training Dataset**: Cleaned Turkish raw data with 5 billion tokens
-- **Training Method**: Initially with DORA, followed by fine-tuning with LORA
+- **Training Dataset**: Cleaned Turkish raw data with 5 billion tokens, custom Turkish instruction sets
+- **Training Method**: Initially with DORA, followed by fine-tuning with LORA
 
 ### DORA Configuration
 
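The two-stage training method described in the diff (DoRA adaptation followed by LoRA fine-tuning) can be sketched with Hugging Face PEFT's `LoraConfig`, which exposes DoRA via its `use_dora` flag. This is a minimal illustration only: the rank, alpha, dropout, and target modules below are assumptions, not the repository's actual configuration.

```python
# Illustrative sketch only: all hyperparameter values are assumptions,
# not this repository's actual DORA configuration.
from peft import LoraConfig

# Stage 1 — DoRA: a LoRA config with weight-decomposed adaptation enabled.
dora_config = LoraConfig(
    r=16,                                  # adapter rank (assumed)
    lora_alpha=32,                         # scaling factor (assumed)
    target_modules=["q_proj", "v_proj"],   # typical Mistral attention projections
    lora_dropout=0.05,
    use_dora=True,                         # enables DoRA (requires PEFT >= 0.9)
    task_type="CAUSAL_LM",
)

# Stage 2 — LoRA fine-tuning: same adapter structure, DoRA disabled.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    use_dora=False,
    task_type="CAUSAL_LM",
)
```

Either config would be attached to the base model with `peft.get_peft_model(model, config)` before its respective training stage.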