|
--- |
|
license: mit |
|
datasets: |
|
- tatsu-lab/alpaca |
|
language: |
|
- en |
|
metrics: |
|
- accuracy |
|
library_name: transformers |
|
pipeline_tag: text-generation |
|
--- |
|
# GPT-2 model finetuned with Alpaca dataset |
|
This is a fine-tuned version of OpenAI's GPT-2, trained on the Alpaca dataset. |
|
The model was trained for 15 epochs. |
|
|
|
In my tests, the best results were obtained with: |
|
``` |
temperature: 0.7 |
top_p: 0.92 |
top_k: 0 |
``` |
|
Feel free to experiment with different values, though. |
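These settings can be collected into a generation-argument dictionary. This is a minimal sketch: the keyword names follow the common `transformers` `generate()` convention, `do_sample=True` is assumed (temperature and top-p only take effect when sampling is enabled), and the `max_new_tokens` value is a hypothetical length cap.

```python
# Suggested sampling settings from the tests above.
# Keys follow the transformers `generate()` keyword convention.
# top_k=0 disables top-k filtering, so nucleus (top-p) sampling
# alone shapes the output distribution.
GENERATION_KWARGS = {
    "do_sample": True,       # assumption: sampling must be on for these to apply
    "temperature": 0.7,
    "top_p": 0.92,
    "top_k": 0,
    "max_new_tokens": 128,   # hypothetical length cap; adjust as needed
}
```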
|
|
|
# Ideal format for inference |
|
``` |
"""Below is an instruction that describes a task. Write a response that appropriately completes the request. |
Instruction:{question} |
Response: """ |
``` |
|
Replace `{question}` with the question of your choice. |
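Putting this together, here is a minimal inference sketch. The helper function and the model id `your-username/gpt2-alpaca` are placeholders (substitute this repository's actual path), and the sampling values mirror those suggested above.

```python
def build_prompt(question: str) -> str:
    """Fill the Alpaca-style template above with a user question."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n"
        f"Instruction:{question}\n"
        "Response: "
    )

prompt = build_prompt("Give me some tips for staying healthy")

# Hypothetical usage with the transformers pipeline API
# (the model id below is a placeholder for this repo's path):
# from transformers import pipeline
# generator = pipeline("text-generation", model="your-username/gpt2-alpaca")
# output = generator(prompt, do_sample=True, temperature=0.7,
#                    top_p=0.92, top_k=0, max_new_tokens=128)
# print(output[0]["generated_text"])
```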
|
|
|
# Model Examples |
|
|
|
### 1. Give me some tips for staying healthy |
|
|