Error in Colab

#2
by sudhir2016 - opened

Getting this error in Colab.
--> 761 raise KeyError(key)
762 value = self._mapping[key]
763 module_name = model_type_to_module_name(key)

KeyError: 'llava'

Llava Hugging Face org
β€’
edited Dec 9, 2023

Hi,

Make sure to have Transformers installed from the main branch: pip install git+https://github.com/huggingface/transformers@main.

Tried it. Now it gives this error.
PyTorch SDPA requirements in Transformers are not met. Please install torch>=2.1.1.

Error persists even after installing torch>=2.1.1.

Llava Hugging Face org

Okay, I'll have a look; this is not expected.

Llava Hugging Face org
β€’
edited Dec 9, 2023

:hug:

Llava Hugging Face org

I could not reproduce on main:


In [1]: from transformers import pipeline
   ...: from PIL import Image
   ...: import requests
   ...: 
   ...: 
   ...: model_id = "llava-hf/bakLlava-v1-hf"
   ...: pipe = pipeline("image-to-text", model=model_id)
   ...: url = "https://huggingface.co./datasets/huggingface/documentation-images/resolve/main/transformers/tasks/ai2d-demo.jpg"
   ...: 
   ...: image = Image.open(requests.get(url, stream=True).raw)
   ...: prompt = "USER: <image>\nWhat does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud\nASSISTANT:"
   ...: 
   ...: outputs = pipe(image, prompt=prompt, generate_kwargs={"max_new_tokens": 200})
   ...: print(outputs)
Loading checkpoint shards: 100%|β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆ| 4/4 [00:01<00:00,  2.29it/s]
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
[{'generated_text': 'USER:  \nWhat does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud\nASSISTANT: Lava'}]

In [2]: import torch

In [3]: torch.__version__
Out[3]: '1.13.1'

Thanks. I was trying the llava-hf/llava-1.5-7b-hf model, not llava-hf/bakLlava-v1-hf. I was trying the 4-bit quantized version as per the example Colab provided on the model page. The issue remains.

Llava Hugging Face org

If you have that issue, it means that you don't have the main version of Transformers installed in your environment (which you can verify by running pip show transformers). Make sure you have v4.36.dev.
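As a quick sanity check, a minimal stdlib-only sketch of the distinction being made here: a main-branch install of Transformers reports a dev version string (e.g. "4.36.0.dev0", the kind of value pip show transformers or transformers.__version__ would print), while a PyPI release reports a plain version like "4.35.2". The version strings below are illustrative examples, not pinned requirements.

```python
# Minimal sketch: distinguish a main-branch dev build of Transformers
# (e.g. "4.36.0.dev0") from a PyPI release (e.g. "4.35.2").
def is_dev_build(version: str) -> bool:
    # Dev builds installed from git carry a ".dev" segment in the version.
    return ".dev" in version

print(is_dev_build("4.36.0.dev0"))  # True: installed from the main branch
print(is_dev_build("4.35.2"))       # False: a PyPI release
```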

Llava Hugging Face org

@sudhir2016 can you try to upgrade torch? pip install -U torch

Many thanks for the incredible support provided by the awesome Hugging Face team! It works now; pip install -U torch did the trick.
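For anyone hitting the same wall: the REPL session above shows torch 1.13.1, which is below the torch>=2.1.1 floor that the SDPA code path enforces, hence the error message. A minimal stdlib-only sketch of that version comparison (the helper names are mine, not from Transformers):

```python
# Minimal sketch: check whether an installed torch version string
# satisfies the torch>=2.1.1 requirement of the SDPA code path.
def version_tuple(v: str) -> tuple:
    # Drop local/dev suffixes such as "+cu118" or ".dev20231209",
    # then keep the numeric components for tuple comparison.
    base = v.split("+")[0].split(".dev")[0]
    return tuple(int(p) for p in base.split(".") if p.isdigit())

def meets_sdpa_requirement(torch_version: str) -> bool:
    return version_tuple(torch_version) >= (2, 1, 1)

print(meets_sdpa_requirement("1.13.1"))       # False: triggers the error above
print(meets_sdpa_requirement("2.1.1+cu118"))  # True: requirement satisfied
```

Upgrading with pip install -U torch moves the environment from the left case to the right one.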

sudhir2016 changed discussion status to closed
