not working

#103
by snieunny - opened

I got this error:
"Authorization header is correct, but the token seems invalid." Even after I changed the API token, I still get it. What is wrong?

I am also getting this same error today (4/23/2024).

OSError: You are trying to access a gated repo.
401 Client Error
I have access and also created a new API token, but I'm still facing the issue.

If you have already accepted the terms and conditions on the model's homepage, try passing the token into the AutoTokenizer as well, not only into AutoModelForCausalLM:

from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "mistralai/Mistral-7B-Instruct-v0.2"
tokenizer = AutoTokenizer.from_pretrained(model_id, token="<your token>")
model = AutoModelForCausalLM.from_pretrained(model_id, token="<your token>")
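
If you prefer not to hard-code the token, recent versions of huggingface_hub should also pick it up from the HF_TOKEN environment variable, so you can export it once and drop the token argument (exact behavior depends on your huggingface_hub version).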

Now it's working fine; I am using notebook_login:

from huggingface_hub import notebook_login
notebook_login()
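
If you are running a plain script rather than a notebook, the same login can be done with login() from huggingface_hub (a minimal sketch; '<your token>' is a placeholder for your own access token):

from huggingface_hub import login

# Authenticates the current process; later from_pretrained() calls pick up the token automatically
login(token="<your token>")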

Hi, same for me. I tried to use the model with the pipeline, but it doesn't work. It gives me: "Cannot access gated repo for url https://huggingface.co./mistralai/Mistral-7B-Instruct-v0.2/resolve/main/config.json.
Access to model mistralai/Mistral-7B-Instruct-v0.2 is restricted. You must be authenticated to access it."
But I have already accepted the terms, and when I go to the Mistral Hugging Face page, it says: "Gated model
You have been granted access to this model"
Does anyone have a solution?
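
In case it helps, the pipeline call itself also accepts a token argument, so the gated download can be authenticated there too (an untested sketch assuming a recent transformers version; adjust the task and token to your setup):

from transformers import pipeline

# Pass the access token so config.json and the weights can be fetched from the gated repo
pipe = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",
    token="<your token>",
)
print(pipe("Hello", max_new_tokens=20)[0]["generated_text"])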

I had the same error. I just regenerated the API key with "write" permissions, and it worked.

Same error:
OSError: We couldn't connect to 'https://huggingface.co.' to load this file, couldn't find it in the cached files and it looks like mistralai/Mistral-7B-Instruct-v0.2 is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co./docs/transformers/installation#offline-mode'.
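
A quick way to check whether the connection and the token actually work is whoami() from huggingface_hub (a small sketch, assuming the token has already been set up via login or HF_TOKEN):

from huggingface_hub import whoami

# Raises an error if huggingface.co is unreachable or the token is missing/invalid;
# otherwise returns details about the account the token belongs to
print(whoami())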

What token? Where do we get this token everybody is talking about?

Create a token on Hugging Face under Settings > Access Tokens (https://huggingface.co./settings/tokens) with at least "read" permissions.

