
Quantization made by Richard Erkhov.


gemma-2b-translation-v0.103 - bnb 4bits

Original model description:

library_name: transformers
language:
  - ko
license: gemma
tags:
  - gemma
  - pytorch
  - instruct
  - finetune
  - translation
widget:
  - messages:
      - role: user
        content: "Hamsters don't eat cats."
inference:
  parameters:
    max_new_tokens: 2048
base_model: beomi/gemma-ko-2b
datasets:
  - traintogpb/aihub-flores-koen-integrated-sparta-30k
pipeline_tag: text-generation
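To run this 4-bit checkpoint, it can be loaded through transformers with a bitsandbytes quantization config. This is a minimal sketch assuming the standard transformers/bitsandbytes API; the `model_id` argument stands in for the actual Hub repo path of the quantized model, and fp16 compute is an assumption:

```python
def load_quantized(model_id: str):
    """Load the checkpoint in 4-bit via bitsandbytes (requires a CUDA GPU)."""
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

    bnb_config = BitsAndBytesConfig(
        load_in_4bit=True,                     # matches the "bnb 4bits" quantization of this card
        bnb_4bit_compute_dtype=torch.float16,  # assumption: fp16 compute dtype
    )
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        quantization_config=bnb_config,
        device_map="auto",
    )
    return tokenizer, model
```

Pass the quantized repo's Hub id as `model_id`; the imports live inside the function so the sketch stays importable without a GPU.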

Gemma 2B Translation v0.103

  • Eval Loss: 1.34507
  • Train Loss: 1.40326
  • lr: 3e-05
  • optimizer: adamw
  • lr_scheduler_type: cosine
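The cosine scheduler listed above decays the learning rate from its peak (3e-05) toward zero over training. Ignoring any warmup phase, the decay can be sketched as:

```python
import math

BASE_LR = 3e-05  # the lr reported above


def cosine_lr(step: int, total_steps: int, base_lr: float = BASE_LR) -> float:
    """Cosine decay from base_lr at step 0 down to 0 at total_steps
    (warmup omitted for simplicity)."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

At step 0 this returns the full 3e-05, at the midpoint half of it, and at the final step (effectively) zero.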

Prompt Template

<bos>### English

Hamsters don't eat cats.

### Korean

햄스터는 고양이를 먹지 않습니다.<eos>
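A small helper can reproduce the template above (the Gemma tokenizer is assumed to prepend `<bos>` itself, so only the text after it is built here):

```python
def build_prompt(english: str) -> str:
    """Format an English sentence into the card's translation prompt template."""
    return f"### English\n\n{english}\n\n### Korean\n\n"


# Generation then continues from the prompt until <eos>. With transformers this
# would look roughly like (hypothetical, assuming a loaded tokenizer/model pair):
#
#   inputs = tokenizer(build_prompt("Hamsters don't eat cats."), return_tensors="pt")
#   output = model.generate(**inputs, max_new_tokens=2048)  # matches the widget config
```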

Model Description

Stats

  • Downloads last month: 84
  • Model size: 1.55B params (Safetensors)
  • Tensor types: F32, FP16, U8