Overview

mistralai developed and released the Mistral-Nemo family of large language models (LLMs).

Variants

| No | Variant | Cortex CLI command |
|----|---------|--------------------|
| 2 | gguf | `cortex run mistral-nemo:gguf` |
| 3 | main/default | `cortex run mistral-nemo` |

Use it with Jan (UI)

  1. Install Jan by following the Quickstart
  2. In the Jan model Hub, search for and download:
    cortexso/mistral-nemo
    

Use it with Cortex (CLI)

  1. Install Cortex by following the Quickstart
  2. Run the model with the command:
    cortex run mistral-nemo
    
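Once the model is running, Cortex exposes an OpenAI-compatible chat-completions API on localhost. The sketch below builds the JSON body for such a request; the port (39281) and endpoint path are assumptions based on Cortex defaults and may differ on your install, so verify them in your Cortex configuration before sending real requests.

```python
import json

# Assumed local endpoint for Cortex's OpenAI-compatible server;
# check your Cortex configuration for the actual port and path.
URL = "http://127.0.0.1:39281/v1/chat/completions"

def build_request(prompt: str, model: str = "mistral-nemo") -> str:
    """Return the JSON body for a chat-completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return json.dumps(payload)

body = build_request("Summarize Mistral-Nemo in one sentence.")
print(body)
```

The body can then be POSTed to `URL` with any HTTP client (e.g. `curl` or `requests`), the same way you would call the OpenAI API.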

Credits
