|
--- |
|
license: apache-2.0 |
|
datasets: |
|
- ajibawa-2023/General-Stories-Collection |
|
language: |
|
- en |
|
tags: |
|
- story |
|
- art |
|
- general audience |
|
- knowledge |
|
--- |
|
**General-Stories-Mistral-7B** |
|
|
|
This model is based on my dataset [General-Stories-Collection](https://huggingface.co./datasets/ajibawa-2023/General-Stories-Collection), which contains **1.3 million** stories written especially for a general audience.
|
|
|
After an extensive training period spanning over 15 days, this model has been meticulously honed to deliver captivating narratives with broad appeal. |
|
Leveraging a vast synthetic dataset comprising approximately **1.3 million** stories tailored for diverse readership, this model possesses a deep understanding of narrative intricacies and themes. |
|
What sets my model apart is not just its ability to generate stories, but its capacity to evoke emotion, spark imagination, and forge connections with its audience. |
|
|
|
I am excited to introduce this powerful tool, ready to spark imagination and entertain readers worldwide with its versatile storytelling capabilities. |
|
|
|
As we embark on this exciting journey of AI storytelling, I invite you to explore the endless possibilities my model has to offer. Whether you're a writer seeking inspiration, a reader in search of a captivating tale, or a creative mind eager to push the boundaries of storytelling, my model is here to inspire, entertain, and enrich your literary experience. |
|
|
|
Kindly note this is the qLoRA version.
|
|
|
|
|
**GGUF & Exllama** |
|
|
|
GGUF: [Link](https://huggingface.co./bartowski/General-Stories-Mistral-7B-GGUF) |
|
|
|
Exllama v2: [Link](https://huggingface.co./bartowski/General-Stories-Mistral-7B-exl2) |
|
|
|
|
|
Special Thanks to [Bartowski](https://huggingface.co./bartowski) for quantizing this model. |
|
|
|
|
|
|
|
**Training** |
|
|
|
The entire dataset was trained on 4 x A100 80GB GPUs. Training for 2 epochs took more than **15 days**. The Axolotl codebase was used for training. All data was trained on Mistral-7B-v0.1 as the base model.
|
|
|
**Example Prompt:** |
|
|
|
This model uses **ChatML** prompt format. |
|
|
|
```
<|im_start|>system
You are a Helpful Assistant.<|im_end|>
<|im_start|>user
{prompt}<|im_end|>
<|im_start|>assistant
```
|
You can modify the above prompt as per your requirements.
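As a minimal sketch, the ChatML prompt above can also be assembled programmatically before being passed to your inference stack. The helper name below is illustrative, not part of the model:

```python
def build_chatml_prompt(user_prompt: str,
                        system_prompt: str = "You are a Helpful Assistant.") -> str:
    """Assemble the ChatML-formatted prompt string this model expects.

    The string ends with the opening assistant turn, so the model
    generates the assistant's reply as a continuation.
    """
    return (
        f"<|im_start|>system\n{system_prompt}<|im_end|>\n"
        f"<|im_start|>user\n{user_prompt}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )


prompt = build_chatml_prompt("Write a short story about a lighthouse keeper.")
print(prompt)
```

Swap in your own system prompt to steer the tone or audience of the generated stories.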
|
|
|
|
|
Special thanks to the open-source community for helping and guiding me to better understand AI and model development.
|
|
|
Thank you for your love & support. |
|
|
|
**Example Output** |
|
|
|
Example 1 |
|
|
|
|
|
|
|
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/mVLGRiYKzFCC2wAJOejLP.jpeg) |
|
|
|
|
|
|
|
Example 2 |
|
|
|
|
|
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/FwCUW9FDDnmBpdnqraWNF.jpeg) |
|
|
|
|
|
|
|
Example 3 |
|
|
|
|
|
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/w0D_eX3xG6MnX5wWD8LT9.jpeg) |
|
|
|
|
|
Example 4 |
|
|
|
|
|
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/64aea8ff67511bd3d965697b/HaJ91YQ9d57SGv7BwTcv_.jpeg) |