---
license: mit
datasets:
- tatsu-lab/alpaca
language:
- en
metrics:
- accuracy
library_name: transformers
pipeline_tag: text-generation
widget:
- text: "Below is an instruction that describes a task. Write a response that appropriately completes the request.\nInstruction: What is artificial intelligence?\nResponse: "
  example_title: "Knowledge-AI"
- text: "Below is an instruction that describes a task. Write a response that appropriately completes the request.\nInstruction: Write a haiku poem on cows\nResponse: "
  example_title: "Poem Generation"
- text: "Below is an instruction that describes a task. Write a response that appropriately completes the request.\nInstruction: What is the meaning of life?\nResponse: "
  example_title: "Philosophy"
- text: "Below is an instruction that describes a task. Write a response that appropriately completes the request.\nInstruction: What is the sky blue?\nResponse: "
  example_title: "Knowledge-sky"
- text: "Below is an instruction that describes a task. Write a response that appropriately completes the request.\nInstruction: Define electrostatic potential\nResponse: "
  example_title: "Knowledge-electricity"
inference:
  parameters:
    temperature: 0.7
    top_k: 50
    top_p: 0.9
    max_length: 200
---

# Model Card: GPT-2 Fine-Tuned on Alpaca
This is a fine-tuned version of OpenAI's GPT-2, trained on the tatsu-lab/alpaca dataset.
It was fine-tuned for 20 epochs with a batch size of 14.


## Model Description

```
license: mit
dataset: tatsu-lab/alpaca
language: en
library_name: transformers
pipeline_tag: text-generation
base_model: gpt2
```
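
The model can be loaded with the standard `transformers` APIs. A minimal loading sketch follows; note that the repo id below is a placeholder, not this model's actual Hub id:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute this model's actual Hugging Face Hub id.
model_id = "your-username/gpt2-alpaca"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
```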

## Examples

1. Give me tips to be healthy
```
Here are some tips to be healthy: 
- Eat a balanced diet that includes plenty of fruits, vegetables, whole grains, lean proteins, and healthy fats.
- Exercise regularly to maintain a healthy weight and reduce the risk of developing any health issues.
- Get enough sleep and stay hydrated by drinking plenty of water. 
- Get enough fiber and healthy fats to support your overall health and wellbeing. 
- Practice mindful breathing and avoid activities that can increase stress. 
- Get enough sleep and maintain a healthy weight.
```

2. Write a poem on cows
```
Cows are one of the most beloved animals in the world, 
They live in lush green fields and care for the ground. 
Their milk is a sweet beverage, 
They live in a world of peace and tranquility. 

Though cows may seem small in comparison, 
They have a special bond with their owners. 
They care for their cows, and they do it right. 
The milk they produce is vital, 
They have a special place in our hearts. 
They are loyal friends, and they will never be defeated. 
Cows are a source of joy and joy, 
They will never be out of place again.
```

3. Which is the fastest animal in the world?
```
The cheetah is the fastest animal in the world.
```


## Bias, Risks, and Limitations

This model inherits all the biases, risks, and limitations of the base GPT-2 model.


## Recommendation

The ideal format for inference is: 
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.
Instruction: {question}
Response: 
```
Replace {question} with the question of your choice.

The parameters I used for inference are:
```
top_k = 20
top_p = 0.9
temperature = 0.7
```
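
Putting the prompt format and these parameters together, a generation call might look like the sketch below (the repo id is again a placeholder, and `max_length=200` is taken from the widget configuration above):

```python
from transformers import pipeline

# Placeholder repo id -- substitute this model's actual Hugging Face Hub id.
generator = pipeline("text-generation", model="your-username/gpt2-alpaca")

question = "Why is the sky blue?"
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n"
    f"Instruction: {question}\n"
    "Response: "
)

# Sampling parameters from the recommendation above.
output = generator(
    prompt,
    do_sample=True,
    top_k=20,
    top_p=0.9,
    temperature=0.7,
    max_length=200,
)
print(output[0]["generated_text"])
```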


## References

1. GPT-2
```
@article{radford2019language,
  title={Language Models are Unsupervised Multitask Learners},
  author={Radford, Alec and Wu, Jeff and Child, Rewon and Luan, David and Amodei, Dario and Sutskever, Ilya},
  year={2019}
}
```

2. tatsu-lab/alpaca
```
@misc{alpaca,
  author = {Rohan Taori and Ishaan Gulrajani and Tianyi Zhang and Yann Dubois and Xuechen Li and Carlos Guestrin and Percy Liang and Tatsunori B. Hashimoto},
  title = {Stanford Alpaca: An Instruction-following LLaMA model},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/tatsu-lab/stanford_alpaca}},
}
```