---
license: apache-2.0
datasets:
- ajibawa-2023/Children-Stories-Collection
language:
- en
tags:
- story
- young children
- educational
- knowledge
---


**Young-Children-Storyteller-Mistral-7B**

This model is based on my dataset [Children-Stories-Collection](https://huggingface.co./datasets/ajibawa-2023/Children-Stories-Collection), which contains over 0.9 million stories written for young children (ages 6 to 12).
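
To browse the dataset yourself, you can load it with the Hugging Face `datasets` library. A minimal sketch (the column names are not documented here, so the code prints the schema rather than assuming them):

```python
# Minimal sketch for inspecting the training dataset.
from datasets import load_dataset

ds = load_dataset("ajibawa-2023/Children-Stories-Collection", split="train")
print(ds)      # column names and row count
print(ds[0])   # one raw example
```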

Drawing upon synthetic datasets meticulously designed with the developmental needs of young children in mind, Young-Children-Storyteller is more than just a tool—it's a companion on the journey of discovery and learning. 
With its boundless storytelling capabilities, this model serves as a gateway to a universe brimming with wonder, adventure, and endless possibilities.

Whether it's embarking on a whimsical adventure with colorful characters, unraveling mysteries in far-off lands, or simply sharing moments of joy and laughter, Young-Children-Storyteller fosters a love for language and storytelling from the earliest of ages. 
Through interactive engagement and age-appropriate content, it nurtures creativity, empathy, and critical thinking skills, laying a foundation for lifelong learning and exploration.

Rooted in a vast repository of over 0.9 million specially curated stories tailored for young minds, Young-Children-Storyteller is poised to revolutionize the way children engage with language and storytelling.

Kindly note that this is a qLoRA version.


**GGUF & Exllama**

Standard Q_K & iMatrix GGUF: [Link](https://huggingface.co./MarsupialAI/Young-Children-Storyteller-Mistral-7B_iMatrix_GGUF/tree/main)

Exllama: TBA

Special Thanks to [MarsupialAI](https://huggingface.co./MarsupialAI) for quantizing the model.

**Training**

The model was trained on the entire dataset using 4 x A100 80GB GPUs. Training for 3 epochs took more than 30 hours. The Axolotl codebase was used for training, with Mistral-7B-v0.1 as the base model.
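
Axolotl drives training from a YAML config, which is not reproduced here. Purely as an illustration, an equivalent qLoRA setup in plain `peft`/`transformers` might look like the sketch below; every hyperparameter is an assumption, not the author's actual configuration.

```python
# Illustrative qLoRA setup only; the author trained with Axolotl,
# and all hyperparameters below are assumed values.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base = "mistralai/Mistral-7B-v0.1"
bnb = BitsAndBytesConfig(
    load_in_4bit=True,                       # 4-bit base weights (the "q" in qLoRA)
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForCausalLM.from_pretrained(base, quantization_config=bnb, device_map="auto")
model = prepare_model_for_kbit_training(model)

lora = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,  # assumed, not the real config
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()           # only the LoRA adapters train
```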

**Example Prompt:**

This model uses the **ChatML** prompt format.

```
<|im_start|>system
You are a Helpful Assistant who can write educational stories for Young Children.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant

```
You can modify the above prompt as per your requirements.
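
For example, a minimal inference sketch with `transformers` (the repo id is assumed from this model card's name; adjust the generation settings to taste):

```python
# Minimal inference sketch using the ChatML format shown above.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ajibawa-2023/Young-Children-Storyteller-Mistral-7B"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build the ChatML prompt exactly as in the template above.
prompt = (
    "<|im_start|>system\n"
    "You are a Helpful Assistant who can write educational stories for Young Children.<|im_end|>\n"
    "<|im_start|>user\n"
    "Write a short story about a curious turtle who learns to share.<|im_end|>\n"
    "<|im_start|>assistant\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.7)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```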


Special thanks to the open-source community for helping and guiding me to better understand AI and model development.

Thank you for your love & support.

**Example Output**

Example 1


![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/J48WYa1qmKnRaILA_44Ao.jpeg)



Example 2

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/H2FucX0CTtV25wlgHmifN.jpeg)



Example 3

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/o7hiMI5noO8fPedUG75H8.jpeg)