Quantization made by Richard Erkhov.
gemma-2b-translation-v0.103 - bnb 4bits
- Model creator: https://huggingface.co./lemon-mint/
- Original model: https://huggingface.co./lemon-mint/gemma-2b-translation-v0.103/
Original model description:

```yaml
library_name: transformers
language:
- ko
license: gemma
tags:
- gemma
- pytorch
- instruct
- finetune
- translation
widget:
- messages:
  - role: user
    content: "Hamsters don't eat cats."
inference:
  parameters:
    max_new_tokens: 2048
base_model: beomi/gemma-ko-2b
datasets:
- traintogpb/aihub-flores-koen-integrated-sparta-30k
pipeline_tag: text-generation
```
Gemma 2B Translation v0.103
- Eval Loss: 1.34507
- Train Loss: 1.40326
- lr: 3e-05
- optimizer: adamw
- lr_scheduler_type: cosine
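For reference, a cosine schedule decays the learning rate from its peak toward zero over training. A minimal sketch of the shape, using the card's 3e-05 peak and ignoring warmup (which the card does not mention):

```python
import math

def cosine_lr(step: int, total_steps: int, peak_lr: float = 3e-05) -> float:
    """Cosine-decayed learning rate, from peak_lr at step 0 down to ~0."""
    progress = step / total_steps
    return 0.5 * peak_lr * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(0, 1000))     # starts at the peak: 3e-05
print(cosine_lr(500, 1000))   # halfway: half the peak
print(cosine_lr(1000, 1000))  # ends near 0
```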
Prompt Template
```
<bos>### English
Hamsters don't eat cats.
### Korean
햄스터는 고양이를 먹지 않습니다.<eos>
```
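The template above can be built programmatically. `build_prompt` is a hypothetical helper (not part of the model's API) that reproduces the format; `<bos>` is omitted because tokenizers normally add it, and generation should stop at `<eos>`:

```python
def build_prompt(english_text: str) -> str:
    """Format an English sentence with this card's translation template.

    <bos> is usually prepended by the tokenizer, so it is not included here.
    """
    return f"### English\n{english_text}\n### Korean\n"

prompt = build_prompt("Hamsters don't eat cats.")
print(prompt)
```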
Model Description
- Developed by: lemon-mint
- Model type: Gemma
- Language(s) (NLP): English
- License: gemma-terms-of-use
- Finetuned from model: beomi/gemma-ko-2b