Unable to load via transformers?
#19 opened by gnumanth
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="google/gemma-7b-it")
```

results in a `KeyError` on Colab:
```
---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1116         try:
-> 1117             config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1118         except KeyError:

KeyError: 'gemma'

During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/models/auto/configuration_auto.py in from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
   1117             config_class = CONFIG_MAPPING[config_dict["model_type"]]
   1118         except KeyError:
-> 1119             raise ValueError(
   1120                 f"The checkpoint you are trying to load has model type `{config_dict['model_type']}` "
   1121                 "but Transformers does not recognize this architecture. This could be because of an "

ValueError: The checkpoint you are trying to load has model type `gemma` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
```
Is there something obvious that I am missing?
Please make sure to upgrade to the latest transformers version:

```shell
pip install -U "transformers==4.38.0"
```
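The pinned `4.38.0` above suggests Gemma support requires at least that transformers release, so it is worth confirming which version the running kernel actually sees (`transformers.__version__`). A minimal stdlib sketch for comparing dotted version strings; real code should prefer `packaging.version.parse`:

```python
def meets_minimum(installed: str, required: str = "4.38.0") -> bool:
    """Numerically compare dotted version strings (simple sketch;
    prefer packaging.version.parse in production code)."""
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(installed) >= as_tuple(required)

# 4.37.x predates the pinned release above; 4.38.0 and later meet it
print(meets_minimum("4.37.2"))  # False
print(meets_minimum("4.38.0"))  # True
```

In a notebook you would pass `transformers.__version__` as the first argument.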
osanseviero changed discussion status to closed
I have upgraded the transformers library, but I am still getting the same error in Colab. Please help.
@msinghy make sure to use a fresh environment (restart the kernel or create a new notebook)
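The restart matters because Python caches every imported module in `sys.modules` for the lifetime of the interpreter: `pip install -U` replaces the files on disk, but not the copy already loaded into the running kernel. A small stdlib illustration (using `json` as a stand-in for any already-imported package such as transformers):

```python
import sys
import json  # stands in for any package imported before the upgrade

# Once imported, the module object is cached; re-importing returns the
# same cached object rather than re-reading the (possibly upgraded)
# files on disk. Only a fresh interpreter picks up the new version.
print("json" in sys.modules)       # True: module is cached
print(__import__("json") is json)  # True: the cached object is reused
```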
Oops, I forgot to do that. Thanks!
Thanks, upgrading and restarting the session worked.