- **[AUTOMATED] Model Memory Requirements** · #16 opened 10 months ago by model-sizer-bot
- **4x version** · 1 reply · #15 opened 11 months ago by ehartford
- **Adding Evaluation Results** · #14 opened 12 months ago by leaderboard-pr-bot
- **What's the model architecture?** · #13 opened 12 months ago by JamesShao
- **Base or chat model?** · #12 opened about 1 year ago by horaceai
- **I'm a newbie; how can I use an existing open-source LLM to train an MoE? Thank you** · #11 opened about 1 year ago by EEEmpty
- **Quantization, please** · 1 reply · #9 opened about 1 year ago by bingw5
- **How much GPU memory does the MoE module need?** · 2 replies · #8 opened about 1 year ago by Jazzlee
- **Multilingual?** · 1 reply · #7 opened about 1 year ago by oFDz
- **Perfect MoEs: my write-up, and help for making your own MoEs** · #6 opened about 1 year ago by rombodawg
- **Add MoE (mixture of experts) tag** · #5 opened about 1 year ago by davanstrien
- **What are the merging parameters?** · 3 replies · #4 opened about 1 year ago by rombodawg
- **Is this a base model or an SFT model?** · 1 reply · #3 opened about 1 year ago by lucasjin
- **Can vLLM be used for inference acceleration?** · 2 replies · #2 opened about 1 year ago by obtion
- **You hold all three top spots on the leaderboard** · #1 opened about 1 year ago by dillfrescott