"Using pipeline:" in the README doesn't work.(+ there is typo)

#4
by PerRing - opened
---> 12 outputs = pipe(image, prompt=prompt, generate_kwargs={"max_new_tokens": 200})
     13 print(outputs)

File /data/MLP/hgyoo/.hg/lib/python3.10/site-packages/transformers/pipelines/image_to_text.py:111, in ImageToTextPipeline.__call__(self, images, **kwargs)
     83 def __call__(self, images: Union[str, List[str], "Image.Image", List["Image.Image"]], **kwargs):
     84     """
     85     Assign labels to the image(s) passed as inputs.
     86 
   (...)
    109         - **generated_text** (`str`) -- The generated text.
    110     """
--> 111     return super().__call__(images, **kwargs)

File /data/MLP/hgyoo/.hg/lib/python3.10/site-packages/transformers/pipelines/base.py:1140, in Pipeline.__call__(self, inputs, num_workers, batch_size, *args, **kwargs)
   1132     return next(
   1133         iter(
   1134             self.get_iterator(
   (...)
   1137         )
   1138     )
...
--> 136     expanded_attn_mask = causal_4d_mask.masked_fill(expanded_attn_mask.bool(), torch.finfo(dtype).min)
    138 # expanded_attn_mask + causal_4d_mask can cause some overflow
    139 expanded_4d_mask = expanded_attn_mask

RuntimeError: The size of tensor a (616) must match the size of tensor b (1231) at non-singleton dimension 3

I ran the "Using pipeline" code exactly as it appears in the README, but I'm getting the dimension error above.
Additionally, there is a typo in that snippet (import request -> import requests).
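
For reference, a minimal sketch of the snippet being run, with the requests typo already fixed. The image URL and prompt below are placeholders for illustration (not necessarily the ones in the README); the pipe(...) call is the one shown in the traceback above.

import requests
from PIL import Image
from transformers import pipeline

model_id = "llava-hf/llava-1.5-7b-hf"
pipe = pipeline("image-to-text", model=model_id)

# Placeholder image and prompt, for illustration only.
url = "https://www.ilankelman.org/stopsigns/australia.jpg"
image = Image.open(requests.get(url, stream=True).raw)
prompt = "USER: <image>\nWhat is shown in this image?\nASSISTANT:"

outputs = pipe(image, prompt=prompt, generate_kwargs={"max_new_tokens": 200})
print(outputs)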

Llava Hugging Face org

This has recently been fixed on transformers main. Can you try:

pip install -U git+https://github.com/huggingface/transformers.git

For the typo: thanks for noticing! I will update it.

I just updated transformers (4.36.0.dev0) with that command, but it still doesn't work. Is this not the latest version?
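
As a sanity check (a generic snippet, not something from the README), the build that is actually being imported can be confirmed with:

import transformers
print(transformers.__version__)  # an install from main reports a .dev0 version, e.g. 4.36.0.dev0
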
+) Also, the line 'processor = AutoProcessor.from_pretrained(model_id)' is missing from the "Using pure transformers" example; see the sketch below.
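
For context, a minimal sketch of the "Using pure transformers" flow with the missing processor line included. The image URL, prompt, and generation settings here are assumptions for illustration, not copied verbatim from the README.

import requests
import torch
from PIL import Image
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-7b-hf"

model = LlavaForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    low_cpu_mem_usage=True,
).to("cuda")

# The line that is missing from the README example:
processor = AutoProcessor.from_pretrained(model_id)

# Placeholder image and prompt, for illustration only.
url = "https://www.ilankelman.org/stopsigns/australia.jpg"
image = Image.open(requests.get(url, stream=True).raw)
prompt = "USER: <image>\nWhat is shown in this image?\nASSISTANT:"

inputs = processor(text=prompt, images=image, return_tensors="pt").to("cuda", torch.float16)
output = model.generate(**inputs, max_new_tokens=200, do_sample=False)
print(processor.decode(output[0], skip_special_tokens=True))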

Llava Hugging Face org
edited Dec 9, 2023

Hi @PerRing!
Nice catch!
You might need to restart the kernel if you are using Google Colab. We just tried it and it seems to work fine: https://huggingface.co./llava-hf/llava-1.5-7b-hf/discussions/2#65748c1a244aefdfc48fbd83

It works after I created a new environment.

PerRing changed discussion status to closed
