---
license: mit
datasets:
- tatsu-lab/alpaca
language:
- en
metrics:
- accuracy
library_name: transformers
pipeline_tag: text-generation
---
# GPT-2 model fine-tuned on the Alpaca dataset
This is a fine-tuned version of OpenAI's GPT-2, trained on the Alpaca dataset.
The model was trained for 15 epochs.

In my tests, the best results were obtained with:
```
temperature: 0.7
top_p: 0.92
top_k: 0
```
Feel free to experiment with different values.

# The ideal format for inference is:
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.
 Instruction:{question}
 Response: 
```
Replace `{question}` with the question of your choice.
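
As a minimal sketch, the prompt format and sampling settings above can be wired into a `transformers` text-generation pipeline. The `build_prompt` helper and the `<this-repo-id>` placeholder are illustrative (substitute this model's actual Hub id), and `max_new_tokens` is an assumed value, not something specified by this card:

```python
# Hypothetical helper: builds a prompt in the format described above.
def build_prompt(question: str) -> str:
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n"
        f" Instruction:{question}\n"
        " Response: "
    )

# Sampling settings reported above to give the best results.
GEN_KWARGS = {
    "do_sample": True,
    "temperature": 0.7,
    "top_p": 0.92,
    "top_k": 0,
    "max_new_tokens": 128,  # assumption: output length is not specified by the card
}

# Usage (replace <this-repo-id> with this model's Hub id):
# from transformers import pipeline
# generator = pipeline("text-generation", model="<this-repo-id>")
# prompt = build_prompt("Give me some tips for staying healthy")
# print(generator(prompt, **GEN_KWARGS)[0]["generated_text"])
```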

# Model Examples

### 1. Give me some tips for staying healthy