allknowingroger/Neuralmaath-12B-MoE
Tags: Text Generation · Transformers · Safetensors · mixtral · Mixture of Experts · frankenmoe · Merge · mergekit · lazymergekit · text-generation-inference · Inference Endpoints
Merged models: Kukedlc/NeuralSynthesis-7b-v0.4-slerp · DT12the/Math-Mixtral-7B
License: apache-2.0
Neuralmaath-12B-MoE / mergekit_moe_config.yml (branch: main)

Commit History

Upload folder using huggingface_hub · 473815d (verified) · allknowingroger committed on Apr 15
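The contents of mergekit_moe_config.yml are not reproduced on this page. For orientation only, a lazymergekit-style MoE merge of the two tagged source models typically follows mergekit's MoE config schema; the sketch below is an assumption, and the gate_mode, dtype, and positive_prompts values are hypothetical rather than the repo's actual settings.

```yaml
# Hypothetical sketch of a mergekit MoE config; NOT the actual
# mergekit_moe_config.yml from this repo (its contents are not shown here).
base_model: Kukedlc/NeuralSynthesis-7b-v0.4-slerp   # assumed base; the repo may use either model
gate_mode: hidden        # assumed router init; mergekit also supports cheap_embed and random
dtype: bfloat16          # assumed output precision
experts:
  - source_model: Kukedlc/NeuralSynthesis-7b-v0.4-slerp
    positive_prompts:    # hypothetical prompts steering the router toward this expert
      - "chat"
      - "reason step by step"
  - source_model: DT12the/Math-Mixtral-7B
    positive_prompts:
      - "math"
      - "solve the following equation"
```

A merge like this shares the attention and embedding weights across experts while duplicating the MLP blocks, which is why a 2x7B frankenMoE in the mixtral architecture comes out at roughly 12B parameters, consistent with the model's name.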