---
license: mit
datasets:
- yahma/alpaca-cleaned
language:
- en
library_name: transformers
pipeline_tag: text-generation
---

# phi-2-alpaca-cleaned
This model is an instruction-tuned version of [microsoft/phi-2](https://huggingface.co./microsoft/phi-2), fine-tuned on the [yahma/alpaca-cleaned](https://huggingface.co./datasets/yahma/alpaca-cleaned) dataset.

## Training
- GPUs: 8 × A6000 48GB
- per_device_train_batch_size 8
- gradient_accumulation_steps 8
- per_device_eval_batch_size 8
- num_train_epochs 3
- learning_rate 2e-5
- warmup_ratio 0.03
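
The hyperparameters above can be sketched as a config dict; note the implied effective global batch size (this is an illustrative sketch, not the card's actual training script):

```python
# Training hyperparameters from the card, expressed as a plain config dict.
config = {
    "per_device_train_batch_size": 8,
    "gradient_accumulation_steps": 8,
    "per_device_eval_batch_size": 8,
    "num_train_epochs": 3,
    "learning_rate": 2e-5,
    "warmup_ratio": 0.03,
}

num_gpus = 8  # 8 × A6000 48GB, per the card

# Effective global batch size = GPUs × per-device batch × gradient accumulation
effective_batch_size = (
    num_gpus
    * config["per_device_train_batch_size"]
    * config["gradient_accumulation_steps"]
)
print(effective_batch_size)  # 512
```

With these settings, each optimizer step sees 512 examples in total, even though each GPU only holds 8 at a time.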