How to run this?

#3
by Autumnlight - opened

How can I run this? I usually run exllama2 or Koboldcpp, but neither seems to support this.

Hi, we've added instructions to the model card on running the model with vLLM or transformers.

Trying to run the model via transformers' AutoModelForCausalLM, I get an authentication error: "Access to model ai21labs/AI21-Jamba-1.5-Mini is restricted. You must be authenticated to access it."

Can you please provide some authentication instructions?

@OmerAtEasily you'll need to:

  1. accept the model terms on the model card page
  2. get an Access Token from your Hugging Face settings page, as explained here: https://huggingface.co./docs/hub/en/security-tokens
  3. pass the token to transformers (explained below)

There are several ways to let transformers use your token:
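The original reply is cut off here, but a minimal sketch of the usual approaches looks like this. You can either run `huggingface-cli login` once in your shell, export the token as the `HF_TOKEN` environment variable, or pass it explicitly via the `token` argument to `from_pretrained` (the token value below is a placeholder, not a real credential):

```python
import os

# Placeholder for illustration; replace with your real Access Token
# from your Hugging Face settings page.
HF_TOKEN = os.environ.get("HF_TOKEN", "hf_your_token_here")

def load_jamba(token: str):
    """Load the gated Jamba model, authenticating with the given token."""
    # Lazy import so this sketch can be read without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model = AutoModelForCausalLM.from_pretrained(
        "ai21labs/AI21-Jamba-1.5-Mini",
        token=token,  # pass the token explicitly to transformers
    )
    tokenizer = AutoTokenizer.from_pretrained(
        "ai21labs/AI21-Jamba-1.5-Mini",
        token=token,
    )
    return model, tokenizer
```

If you run `huggingface-cli login` or set `HF_TOKEN` in your environment, transformers picks the token up automatically and you can drop the `token=` argument entirely.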
