allknowingroger/MultiverseMath-12B-MoE
Tags: Text Generation · Transformers · Safetensors · mixtral · Mixture of Experts · frankenmoe · Merge · mergekit · lazymergekit · allknowingroger/MultiverseEx26-7B-slerp · DT12the/Math-Mixtral-7B · text-generation-inference · Inference Endpoints
License: apache-2.0
File: MultiverseMath-12B-MoE / mergekit_moe_config.yml (branch: main)

Commit History
Upload folder using huggingface_hub · 894bca0 (verified) · allknowingroger · committed on Apr 15
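
The contents of mergekit_moe_config.yml are not reproduced on this page. As a rough guide, below is a minimal sketch of what a lazymergekit / mergekit-moe configuration combining the two listed expert models typically looks like; the base_model choice, gate_mode, dtype, and positive_prompts values are illustrative assumptions, not the actual file.

```yaml
# Hypothetical mergekit-moe config for MultiverseMath-12B-MoE (sketch only).
# The expert models match the ones tagged on this page; everything else
# (base model, gate mode, dtype, routing prompts) is an assumption.
base_model: allknowingroger/MultiverseEx26-7B-slerp
gate_mode: hidden        # route tokens by hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: allknowingroger/MultiverseEx26-7B-slerp
    positive_prompts:
      - "chat"
      - "explain"
  - source_model: DT12the/Math-Mixtral-7B
    positive_prompts:
      - "math"
      - "solve the equation"
```

With mergekit installed, a config like this is typically turned into a merged checkpoint with something like `mergekit-moe mergekit_moe_config.yml ./output-dir`; lazymergekit wraps the same step in a Colab notebook and uploads the result with huggingface_hub, which matches the commit message above.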