---
license: mit
datasets:
  - yahma/alpaca-cleaned
language:
  - en
library_name: transformers
pipeline_tag: text-generation
---

# phi-2-alpaca-cleaned

This model is an instruction-tuned version of microsoft/phi-2, fine-tuned on the yahma/alpaca-cleaned dataset.
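Since the model is tuned on alpaca-cleaned, prompts at inference time should follow the Alpaca instruction template. The helper below is a sketch based on the standard yahma/alpaca-cleaned format; the exact template this checkpoint was trained with is an assumption.

```python
# Sketch of the Alpaca-style prompt template used by the dataset.
# The exact template this checkpoint expects is an assumption based on
# the standard yahma/alpaca-cleaned format.
def build_prompt(instruction: str, input_text: str = "") -> str:
    """Format an instruction (and optional input) into an Alpaca-style prompt."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_prompt("Explain what instruction tuning is.")
print(prompt)
```

The resulting string can be passed to a `transformers` `text-generation` pipeline, and generation can be stopped at the end of the `### Response:` section.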

## Training

- GPUs: 8 × A6000 48GB
- `per_device_train_batch_size`: 8
- `gradient_accumulation_steps`: 8
- `per_device_eval_batch_size`: 8
- `num_train_epochs`: 3
- `learning_rate`: 2e-5
- `warmup_ratio`: 0.03
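The settings above imply an effective (global) train batch size of 8 GPUs × 8 per-device × 8 accumulation steps. A minimal sketch of that arithmetic, with the values copied from the list above:

```python
# Hyperparameters copied from the Training section above.
hparams = {
    "num_gpus": 8,                      # 8 x A6000 48GB
    "per_device_train_batch_size": 8,
    "gradient_accumulation_steps": 8,
    "num_train_epochs": 3,
    "learning_rate": 2e-5,
    "warmup_ratio": 0.03,
}

# Effective batch size = GPUs x per-device batch x accumulation steps.
effective_batch_size = (
    hparams["num_gpus"]
    * hparams["per_device_train_batch_size"]
    * hparams["gradient_accumulation_steps"]
)
print(effective_batch_size)  # 512
```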