prithivMLmods committed: Update README.md
![smollmv1.png](https://cdn-uploads.huggingface.co/production/uploads/65bb837dbfb878f46c77de4c/7XZC42GuwcehV28lUI-8g.png)
# **SMOLLM CoT 360M ON CUSTOM SYNTHETIC DATA**
SmolLM2 is a family of compact language models available in three sizes: 135M, 360M, and 1.7B parameters. They are capable of solving a wide range of tasks while being lightweight enough to run on-device. Fine-tuning a language model like SmolLM2 involves several steps, from setting up the environment to training the model and saving the results. Below is a detailed step-by-step guide based on the provided notebook file.
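As a rough illustration of the preprocessing such a fine-tune typically needs, the sketch below joins one synthetic chain-of-thought record into a single training string. The `<think>` tag convention, the field names, and the chat-marker tokens are assumptions for illustration; the actual notebook may use a different template.

```python
# Minimal sketch: turning a synthetic CoT record into one SFT training string.
# The <|user|>/<|assistant|> markers and <think> tags are assumed conventions,
# not taken from the notebook.

def format_cot_example(record: dict) -> str:
    """Join question, reasoning, and answer into a single training sample."""
    return (
        f"<|user|>\n{record['question']}\n"
        f"<|assistant|>\n<think>{record['reasoning']}</think>\n"
        f"{record['answer']}"
    )

sample = {
    "question": "What is 12 * 7?",
    "reasoning": "12 * 7 = 12 * 5 + 12 * 2 = 60 + 24 = 84.",
    "answer": "84",
}
print(format_cot_example(sample))
```

Strings shaped like this would then be tokenized and fed to a supervised fine-tuning loop (for example, `trl`'s `SFTTrainer` with the SmolLM2-360M checkpoint).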