---
license: llama3.1
---

This model is "Built with Llama". It is based on [meta-llama/Meta-Llama-3.1-8B-Instruct](https://huggingface.co./meta-llama/Meta-Llama-3.1-8B-Instruct) and was created with the help of [mergekit](https://github.com/arcee-ai/mergekit).

This is the mergekit configuration we used: [mergekit_moe_config.yml](https://huggingface.co./deutsche-telekom/Llama-3.1-MoE-8x8B-Instruct-raw/blob/main/mergekit_moe_config.yml)

Note that this is the raw model directly after merging: its router networks are still randomly initialized, so it will not perform better than any single one of its expert models. It requires further training before use, as illustrated by the loading sketch below.

## Licensing

This model is licensed under the Llama 3.1 Community License, Copyright (c) 2024 [Philip May](https://philipmay.org), [Deutsche Telekom AG](https://www.telekom.de/)\
Llama 3.1 is licensed under the Llama 3.1 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved.
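
## Loading the raw checkpoint

A minimal sketch of loading the raw merged checkpoint with `transformers`, e.g. as a starting point for further training. The dtype and device settings are assumptions (the repo id is taken from the config link above), and outputs of the untrained routers should not be relied on:

```python
# Minimal sketch: load the raw merged checkpoint with transformers.
# Assumes `accelerate` is installed so that device_map="auto" works.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deutsche-telekom/Llama-3.1-MoE-8x8B-Instruct-raw"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 to keep the 8x8B experts manageable in memory
    device_map="auto",
)

# The router networks are randomly initialized, so generations from this raw
# model are not meaningful; it needs further training first.
```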