---
license: apache-2.0
datasets:
- ajibawa-2023/General-Stories-Collection
language:
- en
tags:
- story
- art
- general audience
- knowledge
---
**General-Stories-Mistral-7B**

This model is based on my dataset [General-Stories-Collection](https://huggingface.co./datasets/ajibawa-2023/General-Stories-Collection), which contains **1.3 million** stories written especially for a general audience.

After an extensive training period spanning over 15 days, this model has been meticulously honed to deliver captivating narratives with broad appeal. 
Leveraging a vast synthetic dataset comprising approximately **1.3 million** stories tailored for a diverse readership, this model possesses a deep understanding of narrative intricacies and themes.
What sets my model apart is not just its ability to generate stories, but its capacity to evoke emotion, spark imagination, and forge connections with its audience.

I am excited to introduce this powerful tool, ready to spark imagination and entertain readers worldwide with its versatile storytelling capabilities.

As we embark on this exciting journey of AI storytelling, I invite you to explore the endless possibilities my model has to offer. Whether you're a writer seeking inspiration, a reader in search of a captivating tale, or a creative mind eager to push the boundaries of storytelling, my model is here to inspire, entertain, and enrich your literary experience.

Kindly note that this is the qLoRA version.


**GGUF & Exllama**

GGUF: [Link](https://huggingface.co./bartowski/General-Stories-Mistral-7B-GGUF)

Exllama v2: [Link](https://huggingface.co./bartowski/General-Stories-Mistral-7B-exl2)


Special thanks to [Bartowski](https://huggingface.co./bartowski) for quantizing this model.



**Training**

The entire dataset was trained on 4 x A100 80GB GPUs. Training for 2 epochs took more than **15 days**. The Axolotl codebase was used for training. The base model is Mistral-7B-v0.1.

**Example Prompt:**

This model uses **ChatML** prompt format.

```
<|im_start|>system
You are a Helpful Assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant

```
You can modify the above prompt as per your requirements.
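As a sketch, the ChatML prompt above can be assembled programmatically before being passed to your inference backend. The helper below is hypothetical (it is not part of the model release); it simply concatenates the ChatML special tokens shown above:

```python
def build_chatml_prompt(user_prompt: str,
                        system_prompt: str = "You are a Helpful Assistant.") -> str:
    """Assemble a ChatML-formatted prompt string for this model.

    The returned string ends after the assistant header, so the model
    continues generating the assistant's reply from that point.
    """
    return (
        f"<|im_start|>system\n{system_prompt}<|im_end|>\n"
        f"<|im_start|>user\n{user_prompt}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Example: build a prompt for a story request.
prompt = build_chatml_prompt("Write a short story about a lighthouse keeper.")
print(prompt)
```

If you load the model with the `transformers` library, the tokenizer's chat template (when configured for ChatML) can produce the same format from a list of messages.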


I want to say special Thanks to the Open Source community for helping & guiding me to better understand the AI/Model development.

Thank you for your love & support.

**Example Output**

Example 1



![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/mVLGRiYKzFCC2wAJOejLP.jpeg)



Example 2


![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/FwCUW9FDDnmBpdnqraWNF.jpeg)



Example 3


![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/w0D_eX3xG6MnX5wWD8LT9.jpeg)


Example 4


![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/HaJ91YQ9d57SGv7BwTcv_.jpeg)