Concedo-llamacpp
This is a placeholder model used by Concedo's llamacpp-powered KoboldAI API emulator. This is NOT LLaMA; do not download or use this model directly.
It is required to ensure the API works correctly within the official KoboldAI client.
Check out https://github.com/LostRuins/llamacpp-for-kobold for more information.
All feedback and comments can be directed to Concedo on the KoboldAI Discord.
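As a rough illustration of how the emulator is typically used, the sketch below sends a generation request to a locally running instance through a KoboldAI-style endpoint. The host, port, endpoint path, and payload field names here are assumptions for illustration only; consult the project README linked above for the actual API details.

```python
# Minimal sketch, assuming a llamacpp-for-kobold instance is running locally
# and exposes a KoboldAI-compatible generate endpoint (assumed URL below).
import json
import urllib.request

ENDPOINT = "http://localhost:5001/api/v1/generate"  # assumed default host/port/path

payload = {
    "prompt": "Once upon a time,",
    "max_length": 50,  # assumed parameter name
}

req = urllib.request.Request(
    ENDPOINT,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Print the raw JSON response from the emulator.
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read().decode("utf-8")))
```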