Failed to load Qwen2-VL using transformers built from source (60226fd)
#7
by
Zoey1024
- opened
Following the doc tutorial, I installed transformers from the latest git commit:
pip install git+https://github.com/huggingface/transformers
and ran the model loading step:
from transformers import Qwen2VLForConditionalGeneration
model = Qwen2VLForConditionalGeneration.from_pretrained(
    pretrained_model_name_or_path="Qwen/Qwen2-VL-2B-Instruct",
    torch_dtype="auto",
    device_map="auto",
)
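As a quick sanity check before loading weights (this is a suggestion, not part of the original report), you can confirm whether the source build actually installed by querying package metadata with the standard library; the helper name `installed_version` below is hypothetical:

```python
import importlib.metadata  # available since Python 3.8

def installed_version(package):
    """Return the installed version string, or None if the package is missing."""
    try:
        return importlib.metadata.version(package)
    except importlib.metadata.PackageNotFoundError:
        return None

# If the git install failed silently, this prints None instead of a version.
print(installed_version("transformers"))
```

A git install of transformers typically reports a `.dev0` suffix in its version string, which is one way to tell a source build from a PyPI release.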
which raises the following error:
File "/home/xxx/xxx/qwen2.py", line 1, in <module>
from transformers import Qwen2VLForConditionalGeneration
File "<frozen importlib._bootstrap>", line 1039, in _handle_fromlist
File "/home/xxx/xxx/.venv/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1657, in __getattr__
value = getattr(module, name)
File "/home/xxx/xxx/.venv/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1656, in __getattr__
module = self._get_module(self._class_to_module[name])
File "/home/xxx/xxx/.venv/lib/python3.8/site-packages/transformers/utils/import_utils.py", line 1668, in _get_module
raise RuntimeError(
RuntimeError: Failed to import transformers.models.qwen2_vl.modeling_qwen2_vl because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):
issubclass() arg 1 must be a class
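For context (my reading, not confirmed by the thread): `issubclass() arg 1 must be a class` is the standard TypeError Python raises when the first argument to `issubclass` is not a class object, which usually means something upstream resolved to None or an instance during a partially failed import. A minimal sketch, with a hypothetical `safe_issubclass` guard:

```python
class Base:
    pass

def safe_issubclass(obj, base):
    """issubclass() that returns False instead of raising when obj is not a class."""
    return isinstance(obj, type) and issubclass(obj, base)

print(safe_issubclass(Base, Base))   # a real class: True
print(safe_issubclass(None, Base))   # not a class: False, no TypeError
```

So the TypeError itself is a symptom; the real failure is whatever broke the import of `transformers.generation.utils` above it in the traceback.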
Is there an issue with the latest transformers version (60226fd) that affects the Qwen2-VL integration?
Has the problem been solved yet, bro?
not yet
Could you provide more environment information?
You can run:
python -m torch.utils.collect_env
transformers-cli env
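If those commands aren't available for some reason, a bare-bones stdlib snippet (my suggestion, not from the thread) covers the basics:

```python
import platform
import sys

# Minimal environment report using only the standard library.
print("python  :", sys.version.split()[0])
print("platform:", platform.platform())
```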