Ollama/llama.cpp support

#1
by dpreti - opened
Almawave org • edited 8 days ago

We observed that, at the moment, the model is not served correctly with Ollama and llama.cpp.

We are currently investigating the cause of this unexpected behavior.
In the meantime, we strongly suggest serving the model with vLLM or the Transformers library, as shown in the model card.

dpreti changed discussion title from Ollama support to Ollama/llama.cpp support
Almawave org

The Velvet-2B model has been released in the Ollama library. It can be used directly with the command:

ollama run Almawave/velvet:2B

All the versions, corresponding to the different tags, are available here:
https://ollama.com/Almawave/Velvet

dpreti changed discussion status to closed
