---
license: apache-2.0
tags:
- merge
- mergekit
- dare_ties
- "7B"
- "eren23/OGNO-7b-dpo-truthful"
- "Kquant03/NeuralTrix-7B-dpo-laser"
---

# MergedAI-7B-DARE-0

A merged model built with [mergekit](https://github.com/cg123/mergekit) using DARE (Drop And REscale) combined with the TIES sign-election step (`dare_ties`).

## Model Details

- **Base Models**:
  * [eren23/OGNO-7b-dpo-truthful](https://huggingface.co./eren23/OGNO-7b-dpo-truthful)
  * [Kquant03/NeuralTrix-7B-dpo-laser](https://huggingface.co./Kquant03/NeuralTrix-7B-dpo-laser)
- **Merge Method**: dare_ties

## Configuration

```yaml
models:
  - model: eren23/OGNO-7b-dpo-truthful
  - model: Kquant03/NeuralTrix-7B-dpo-laser
    parameters:
      density: 0.53
      weight: 0.95
merge_method: dare_ties
base_model: eren23/OGNO-7b-dpo-truthful
parameters:
  int8_mask: true
dtype: bfloat16
```

## Usage

This model can be used with the standard transformers library:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("Ro-xe/MergedAI-7B-DARE-0")
tokenizer = AutoTokenizer.from_pretrained("Ro-xe/MergedAI-7B-DARE-0")
```
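
## How `dare_ties` Works (Sketch)

Conceptually, DARE randomly drops a fraction of each model's delta parameters (the difference from the base model, controlled by `density`) and rescales the survivors to preserve the expected value; the TIES step then elects a per-parameter sign by majority and merges only agreeing contributions. The toy NumPy sketch below illustrates both steps on small arrays; it is an illustration of the idea only, not mergekit's actual implementation, and the function names are made up for this example.

```python
import numpy as np

def dare_drop_rescale(delta, density, rng):
    # DARE: randomly keep each delta parameter with probability `density`,
    # then rescale survivors by 1/density so the expected delta is unchanged.
    mask = rng.random(delta.shape) < density
    return np.where(mask, delta / density, 0.0)

def ties_sign_elect(deltas):
    # TIES sign election: pick the majority sign per parameter
    # (by summed signed magnitude across models).
    total = np.sum(deltas, axis=0)
    elected = np.sign(total)
    # Keep only contributions whose sign agrees with the elected sign,
    # then average them over the number of agreeing models.
    agree = [np.where(np.sign(d) == elected, d, 0.0) for d in deltas]
    counts = np.maximum(sum(np.sign(d) == elected for d in deltas), 1)
    return np.sum(agree, axis=0) / counts

rng = np.random.default_rng(0)
delta = np.array([1.0, -2.0, 0.5, 0.0, 3.0, -1.0, 0.25, 4.0])

# Sparsify one model's delta at density 0.53 (as in this card's config).
sparse = dare_drop_rescale(delta, density=0.53, rng=rng)

# Merge two (already sparsified) deltas with sign election.
merged = ties_sign_elect([np.array([1.0, -1.0]), np.array([2.0, -3.0])])
```

Every surviving entry of `sparse` equals the original delta divided by 0.53; the rest are zeroed. In the merge example, both inputs agree on signs `[+, -]`, so the result is their elementwise mean, `[1.5, -2.0]`.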