---
base_model:
- ifable/gemma-2-Ifable-9B
- BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference
library_name: transformers
tags:
- merge
- mergekit
language:
- en
---

# Gemma 2 Aeria Infinity 9B

Gemma 2 Aeria Infinity 9B is a merge of the following models using [Mergekit](https://github.com/arcee-ai/mergekit):

* [ifable/gemma-2-Ifable-9B](https://huggingface.co./ifable/gemma-2-Ifable-9B)
* [BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference](https://huggingface.co./BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference)

## 🧩 Configuration

```yaml
base_model: ifable/gemma-2-Ifable-9B
models:
  - model: ifable/gemma-2-Ifable-9B
    # No parameters necessary for base model
  - model: BAAI/Gemma2-9B-IT-Simpo-Infinity-Preference
    parameters:
      density: 0.5
      weight: 1
merge_method: dare_ties
parameters:
  int8_mask: true
dtype: bfloat16
```
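
To reproduce the merge yourself, save the configuration above as `config.yaml` and run it through Mergekit. The snippet below is a minimal sketch using Mergekit's Python API, following the same pattern as the LazyMergekit notebook; option names and defaults can vary between Mergekit versions, so check the version you have installed. The `mergekit-yaml config.yaml ./output-dir` CLI is an equivalent alternative.

```python
# Sketch: reproduce the DARE-TIES merge from config.yaml with Mergekit's Python API.
# The output path below is illustrative; adjust it to taste.
import torch
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as f:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(f))

run_merge(
    merge_config,
    out_path="./gemma-2-aeria-infinity-9b",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```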

## 💻 Usage

```python
# pip install -qU transformers accelerate

from transformers import AutoTokenizer
import transformers
import torch

model = "AELLM/gemma-2-aeria-infinity-9b"
messages = [{"role": "user", "content": "You're a chef AI in a kitchen that serves alien species. What would you cook, and how do you handle bizarre ingredient requests?"}]

# Format the conversation with Gemma 2's chat template.
tokenizer = AutoTokenizer.from_pretrained(model)
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.float16,  # torch.bfloat16 also works and matches the merge dtype
    device_map="auto",
)

outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95)
print(outputs[0]["generated_text"])
```
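
If you prefer not to use the `pipeline` helper, the model can also be loaded directly with `AutoModelForCausalLM`. The snippet below is a minimal sketch that reuses the same prompt and sampling settings and decodes only the newly generated tokens.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "AELLM/gemma-2-aeria-infinity-9b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # bfloat16 matches the merge dtype
    device_map="auto",
)

messages = [{"role": "user", "content": "You're a chef AI in a kitchen that serves alien species. What would you cook, and how do you handle bizarre ingredient requests?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(
    input_ids,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_k=50,
    top_p=0.95,
)
# Print only the model's reply, skipping the prompt tokens.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```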