Error while loading model #16
opened by imjunaidafzal
I'm trying to run the provided code in Google Colab but get the following error.
Code
from transformers import AutoModelForCausalLM, AutoTokenizer
model_id = "mistralai/Mixtral-8x7B-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
text = "Hello my name is"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
Error
<ipython-input-2-047a4565ea74> in <cell line: 6>()
4 tokenizer = AutoTokenizer.from_pretrained(model_id)
5
----> 6 model = AutoModelForCausalLM.from_pretrained(model_id)
7
8 text = "Hello my name is"
(2 intermediate frames hidden)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in __getitem__(self, key)
759 return self._extra_content[key]
760 if key not in self._mapping:
--> 761 raise KeyError(key)
762 value = self._mapping[key]
763 module_name = model_type_to_module_name(key)
KeyError: 'mixtral'
I ran transformers 4.35.2 and got the same error. It happens because the 'mixtral' model type is not in that version's config mapping; upgrading to the latest release adds the corresponding entry and resolves it. The latest version at this point is 4.36.
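If you want to double-check what your installed transformers actually knows about, the lookup that raises the KeyError goes through the auto-config mapping, so a quick sanity check like this works (CONFIG_MAPPING_NAMES is an internal mapping rather than a stable API, so treat this as a sketch):

import transformers
from transformers.models.auto.configuration_auto import CONFIG_MAPPING_NAMES

# 'mixtral' is absent from this mapping on 4.35.2 and present from 4.36.0 on,
# which is exactly what the KeyError in the traceback is complaining about.
print(transformers.__version__)
print("mixtral" in CONFIG_MAPPING_NAMES)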
Yes, pip install -U transformers should solve the issue.
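One Colab-specific caveat (this assumes transformers was already imported in the running notebook): pip puts the new version on disk, but the old one stays loaded in the session, so restart the runtime after upgrading. Something like:

# Run this in a Colab cell, then restart the runtime (Runtime menu) so the
# freshly installed transformers >= 4.36.0 is the one that gets imported.
!pip install -U transformers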
Hi @shaun-glass, I have tested it with 4.35.2 but got the same error.
I got the same error but solved it with:
pip install -U transformers
which ended up installing huggingface-hub-0.20.1, tokenizers-0.15.0, and transformers-4.36.2.
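For anyone else hitting this, a quick way to confirm the environment before re-running the snippet from the top of the thread (the exact version numbers are just what pip resolved here):

import huggingface_hub
import tokenizers
import transformers

# Mixtral support landed in transformers 4.36.0; the other two are simply
# the versions pip pulled in alongside it in this case.
print(transformers.__version__)      # e.g. 4.36.2
print(tokenizers.__version__)        # e.g. 0.15.0
print(huggingface_hub.__version__)   # e.g. 0.20.1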