---
base_model: unsloth/gemma-2-9b-bnb-4bit
language:
- en
license: apache-2.0
tags:
- text-generation-inference
- transformers
- unsloth
- gemma2
- trl
---
# Uploaded model
- **Developed by:** helixx999 (Harsh Jain)
- **License:** apache-2.0
- **Finetuned from model:** unsloth/gemma-2-9b-bnb-4bit

This is Gemma 2 (9B) fine-tuned on the SemEval-2014 restaurant dataset using the Unsloth framework.
Training parameters (as passed to the trainer's `TrainingArguments`):

```python
per_device_train_batch_size = 2,
gradient_accumulation_steps = 4,      # effective batch size: 2 * 4 = 8
warmup_steps = 6,                     # previously 5
max_steps = 60,
learning_rate = 1e-4,                 # previously 2e-4
fp16 = not is_bfloat16_supported(),
bf16 = is_bfloat16_supported(),
logging_steps = 1,
optim = "adamw_8bit",
weight_decay = 0.01,
lr_scheduler_type = "linear",
seed = 3407,
output_dir = "./tensorLog",
report_to = "wandb",
```
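For context, the parameters above slot into an Unsloth + TRL training setup roughly as follows. This is a minimal sketch, not the exact training script: the `max_seq_length` value and the dataset preparation are assumptions, and `dataset` stands in for the SemEval-2014 restaurant data prepared separately.

```python
from unsloth import FastLanguageModel, is_bfloat16_supported
from trl import SFTTrainer
from transformers import TrainingArguments

# Load the 4-bit base model this card was fine-tuned from.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gemma-2-9b-bnb-4bit",
    max_seq_length=2048,  # assumption: not stated on this card
    load_in_4bit=True,
)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,  # SemEval-2014 restaurant data, prepared elsewhere
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,  # effective batch size: 2 * 4 = 8
        warmup_steps=6,
        max_steps=60,
        learning_rate=1e-4,
        fp16=not is_bfloat16_supported(),
        bf16=is_bfloat16_supported(),
        logging_steps=1,
        optim="adamw_8bit",
        weight_decay=0.01,
        lr_scheduler_type="linear",
        seed=3407,
        output_dir="./tensorLog",
        report_to="wandb",
    ),
)
trainer.train()
```

With `max_steps = 60` and an effective batch size of 8, training sees 480 examples total, so this run is a short fine-tune rather than a full pass over the dataset.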
[<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)