'DeepSeek-V2' model outputs mixed languages

#4
by gigascake - opened

What is the issue?

  • Possibly a multi-language support problem.

I used the '16b-lite-chat-f16' (31 GB) model.

This model's inference speed is faster than the previous version.

But sometimes the output response contains Chinese mixed with the request language,

and then it continues to respond in mixed languages.

This problem occurs specifically when using RAG: after uploading a document and then querying it. For reference, a minimal sketch of how the query is sent is shown below.
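Here is a minimal sketch of sending such a RAG-style query to the local Ollama API. The endpoint, model tag, and the English-only system prompt are assumptions (a workaround being tested), not a confirmed fix.

```python
# Minimal sketch: query the local Ollama chat API with a system prompt that
# pins the response language. Endpoint and model tag are assumptions based
# on a default Ollama install and the model named in this report.
import requests

OLLAMA_URL = "http://localhost:11434/api/chat"   # default local endpoint
MODEL = "deepseek-v2:16b-lite-chat-f16"          # tag used in this report

def ask(question: str, context: str) -> str:
    """Send a RAG-style prompt (retrieved document text + question)."""
    payload = {
        "model": MODEL,
        "stream": False,
        "messages": [
            # Workaround attempt: explicitly constrain the output language.
            {"role": "system",
             "content": "Answer only in English. Do not mix languages."},
            {"role": "user",
             "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    }
    resp = requests.post(OLLAMA_URL, json=payload, timeout=600)
    resp.raise_for_status()
    return resp.json()["message"]["content"]

if __name__ == "__main__":
    print(ask("Summarize the uploaded document.", "<retrieved document text>"))
```

Even with a system prompt like this, the mixed-language output still appears intermittently after document upload and querying.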

(screenshot attached: image.png, showing the mixed-language output)

Please check, and suggest a solution.

OS: Fedora 39
GPU: NVIDIA A4000 * 4
CPU: AMD Threadripper 7980X

Thanks, as always.

OS
Linux

GPU
Nvidia

CPU
AMD

Ollama version
0.1.40

deepseek-ai/deepseek-coder-1.3b-instruct is the best coder SLM that exists. Please, we need a DeepSeek-Coder-1.1B_V2-instruct (pre-trained) version; it will surely be the best coder SLM in the world!
