---
language:
  - en
license: apache-2.0
tags:
  - text-generation-inference
  - transformers
  - unsloth
  - gemma
  - trl
  - sft
base_model: google/gemma-2b
datasets:
  - yahma/alpaca-cleaned
pipeline_tag: text-generation
---

# gemma-2b-alpaca-sft

## Uploaded model

- **Developed by:** venkateshmurugadas
- **License:** apache-2.0
- **Finetuned from model:** google/gemma-2b

This Gemma model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
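Since the model was fine-tuned on yahma/alpaca-cleaned, prompts in the standard Alpaca instruction format should match the training distribution. A minimal sketch of that template (the helper name is ours, and the exact wording is assumed from the public Alpaca format, not confirmed by this card):

```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Format a request in the Alpaca instruction template
    (the format used by yahma/alpaca-cleaned)."""
    if input_text:
        # Variant with an additional input/context field.
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    # Instruction-only variant.
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_alpaca_prompt("Summarize the plot of Hamlet.")
print(prompt)
```

The resulting string can then be passed to a standard `transformers` text-generation pipeline, e.g. `pipeline("text-generation", model="venkateshmurugadas/gemma-2b-alpaca-sft")` (repo id assumed from the page title).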