poor performance for DeepSeek-V3-AWQ

#9
by fridayl - opened

(screenshot: image.png, showing the model's output)
The model generates unrelated content. Launch command:
python3 -m vllm.entrypoints.openai.api_server --host 0.0.0.0 --port 9000 --max-model-len 65536 --max-num-batched-tokens 65536 --trust-remote-code --tensor-parallel-size 8 --gpu-memory-utilization 0.97 --dtype float16 --served-model-name deepseek-chat --model /models/DeepSeek-V3-awq
Sampling parameters: temperature=0.6, top_p=0.67
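For reference, a minimal sketch of how a client request with these sampling settings would look against the OpenAI-compatible endpoint started by the command above (model name and port taken from the launch flags; the prompt and host are placeholders):

```python
import json

def build_chat_request(prompt: str) -> dict:
    """Build a /v1/chat/completions body with the sampling
    settings reported in this thread."""
    return {
        "model": "deepseek-chat",  # matches --served-model-name
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.6,
        "top_p": 0.67,
    }

payload = build_chat_request("Hello")
# POST to http://<host>:9000/v1/chat/completions, e.g.:
#   requests.post("http://localhost:9000/v1/chat/completions", json=payload)
print(json.dumps(payload))
```

If the degraded quality persists even with these settings, it may point at the quantization itself rather than the sampling configuration.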

Same here. There is still a noticeable gap between this deployment's performance and the official site, and in some cases it performs worse than the V2.5 we deployed previously.

Cognitive Computations org

Please try R1.
