---
license: mit
datasets:
  - MuskumPillerum/General-Knowledge
language:
  - en
metrics:
  - accuracy
library_name: transformers
pipeline_tag: text-generation
widget:
  - text: >-
      Below is an instruction that describes a task. Write a response that
      appropriately completes the request.
      Instruction: What is artificial intelligence?
      Response:
    example_title: Knowledge-AI
  - text: >-
      Below is an instruction that describes a task. Write a response that
      appropriately completes the request.
      Instruction: Write a poem on cows
      Response:
    example_title: Poem Generation
  - text: >-
      Below is an instruction that describes a task. Write a response that
      appropriately completes the request.
      Instruction: What is the meaning of life?
      Response:
    example_title: Philosophy
  - text: >-
      Below is an instruction that describes a task. Write a response that
      appropriately completes the request.
      Instruction: Why is the sky blue?
      Response:
    example_title: Knowledge-sky
  - text: >-
      Below is an instruction that describes a task. Write a response that
      appropriately completes the request.
      Instruction: Define the term torque.
      Response:
    example_title: Knowledge-torque
inference:
  parameters:
    temperature: 0.7
    top_k: 50
    top_p: 0.9
    max_length: 200
tags:
  - chemistry
  - biology
  - text-generation-inference
---

# Model Card for Model ID

This is a fine-tuned version of OpenAI's GPT-2, trained for chatting and question answering. For a 124M-parameter model, it performs very well on general knowledge.

Intended purpose of the model: to provide a capable, easy-to-use, and reliable model that can run on a consumer-level graphics card (or perhaps even a CPU). This model vastly outperforms base GPT-2 and many other models of a similar parameter count.

### Model Description

```
license: mit
dataset: MuskumPillerum/General-Knowledge
language: en
library_name: transformers
pipeline_tag: text-generation
base_model: gpt2
Fine-tuning epochs: 50
Fine-tuning loss: 0.06
```

## Examples

1. Give me tips to be healthy
```
Here are some tips to be healthy:
- Eat a balanced diet that includes plenty of fruits, vegetables, whole grains, lean proteins, and healthy fats.
- Exercise regularly to maintain a healthy weight and reduce the risk of developing any health issues.
- Get enough sleep and stay hydrated by drinking plenty of water.
- Get enough fiber and healthy fats to support your overall health and wellbeing.
- Practice mindful breathing and avoid activities that can increase stress.
- Get enough sleep and maintain a healthy weight.
```

2. Write a poem on cows
```
Cows are beloved creatures,
Families who love to cuddle,
In their humble homes they rest.

Cows are the bread and butter of life,
They can be found in the fields and gardens,
Birds and insects to feed us.

Cows are the sweet smell of life,
A source of joy and happiness,
Providing shelter and a home to rest.

Cows are the faithful guardians of the land,
They will protect our spirits from strife,
In their care they will ensure that our food and shelter will be nourished.

Cows are the beloved petals of life,
They are loyal companions and faithful stewards,
Their presence will always be seen.
```

3. Which is the fastest animal in the world
```
The cheetah is the fastest animal in the world.
```

## Bias, Risks, and Limitations

This model has all the biases, risks, and limitations of the base GPT-2 model. However, the prompt template used with this model makes it safer than the traditional GPT-2 model.

## Recommendation

The ideal format for inference is:

```
Read the question and give an honest answer. Your answers should not include any unethical, racist, sexist, dangerous, or illegal content. If the question is wrong, or does not make sense, accept it instead of giving the wrong answer.
Question: Your question
Answer:
```

Replace 'Your question' with the question of your choice. The prompt format was inspired by both the Alpaca dataset (tatsu-lab/alpaca) and Llama 2's (meta-llama/Llama-2-7b-chat-hf) prompt design.

The parameters I used for inference are:

```
top_k = 20
top_p = 0.9
temperature = 0.7
```
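For convenience, here is a minimal inference sketch using the `transformers` pipeline with the template and sampling parameters above. The `model_id` below is a hypothetical placeholder; substitute this model's actual Hub ID.

```python
from transformers import pipeline

# Placeholder: replace with this model's actual Hugging Face Hub ID.
model_id = "your-username/your-finetuned-gpt2"

generator = pipeline("text-generation", model=model_id)

# Build the prompt using the recommended template from this card.
question = "Why is the sky blue?"
prompt = (
    "Read the question and give an honest answer. "
    "Your answers should not include any unethical, racist, sexist, "
    "dangerous, or illegal content. If the question is wrong, or does not "
    "make sense, accept it instead of giving the wrong answer.\n"
    f"Question: {question}\n"
    "Answer:"
)

# Sampling parameters recommended above (max_length matches the widget config).
output = generator(
    prompt,
    do_sample=True,
    top_k=20,
    top_p=0.9,
    temperature=0.7,
    max_length=200,
)[0]["generated_text"]

# Strip the prompt so only the model's answer is printed.
print(output[len(prompt):].strip())
```

Note that `do_sample=True` is required for `top_k`, `top_p`, and `temperature` to take effect; otherwise the pipeline decodes greedily.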
## References used

1. GPT-2:

   @article{radford2019language,
     title={Language Models are Unsupervised Multitask Learners},
     author={Radford, Alec and Wu, Jeff and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya},
     year={2019}
   }

2. MuskumPillerum/General-Knowledge