---
license: mit
datasets:
  - tatsu-lab/alpaca
language:
  - en
metrics:
  - accuracy
library_name: transformers
pipeline_tag: text-generation
---

# GPT-2 model fine-tuned on the Alpaca dataset

This is a fine-tuned version of OpenAI's GPT-2, trained on the Alpaca dataset for 15 epochs.

In my tests, the best results were obtained with:

- temperature: 0.7
- top_p: 0.92
- top_k: 0

Feel free to experiment with different values.

The ideal format for inference is:

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.
 Instruction:{question}
 Response: 
```

Replace {question} with the question of your choice.
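
Below is a minimal inference sketch using the `transformers` text-generation pipeline with the prompt format and sampling values above. The repository id `Arjun-G-Ravi/GPT2-Alpaca` is an assumption based on this page; substitute the actual model id if it differs.

```python
from transformers import pipeline

# Assumed repo id -- replace with the actual model id if different.
generator = pipeline("text-generation", model="Arjun-G-Ravi/GPT2-Alpaca")

question = "Give me some tips for staying healthy"
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n"
    " Instruction:" + question + "\n"
    " Response: "
)

# Sampling values suggested above; adjust to taste.
output = generator(
    prompt,
    do_sample=True,
    temperature=0.7,
    top_p=0.92,
    top_k=0,
    max_new_tokens=128,
)
print(output[0]["generated_text"])
```

Note that in `transformers`, `top_k=0` disables top-k filtering, so the candidate set is controlled by nucleus sampling (`top_p`) alone.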

## Model Examples

1. Give me some tips for staying healthy

Stay hydrated and active while exercising. Get enough sleep, exercise, and have a balanced meal plan. Eat healthy and energy-packed snacks to stay active and healthy. Find the perfect combination of foods for your body, mind, and body. Preventative exercise can help you lose weight and improve metabolism. Get enough sleep and stretch during the day. Eat healthy and have a balanced and energizing breakfast!