RaushanTurganbay (HF staff) committed
Commit 96fcc4d · verified · 1 Parent(s): e0b2691

Update pipeline example

Files changed (1)
  1. README.md +27 -2
README.md CHANGED
@@ -30,12 +30,37 @@ other versions on a task that interests you.
  
  ### How to use
  
- Here's the prompt template for this model:
+ Here's the prompt template for this model, but we recommend using chat templates to format the prompt with `processor.apply_chat_template()`.
+ That will apply the correct template for a given checkpoint for you.
+
  ```
  "<|im_start|>system\n<your_system_prompt_here><|im_end|><|im_start|>user\n<image>\n<your_text_prompt_here><|im_end|><|im_start|>assistant\n"
  ```
  
- You can load and use the model like following:
+ To run the model with the `pipeline`, see the example below:
+
+ ```python
+ from transformers import pipeline
+
+ pipe = pipeline("image-text-to-text", model="llava-hf/llava-v1.6-34b-hf")
+ messages = [
+     {
+         "role": "user",
+         "content": [
+             {"type": "image", "url": "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/ai2d-demo.jpg"},
+             {"type": "text", "text": "What does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud"},
+         ],
+     },
+ ]
+
+ out = pipe(text=messages, max_new_tokens=20)
+ print(out)
+ >>> [{'input_text': [{'role': 'user', 'content': [{'type': 'image', 'url': 'https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/tasks/ai2d-demo.jpg'}, {'type': 'text', 'text': 'What does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud'}]}], 'generated_text': 'Lava'}]
+ ```
+
+
+ You can also load and use the model as follows:
+
  ```python
  from transformers import LlavaNextProcessor, LlavaNextForConditionalGeneration
  import torch
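
For context, here is a minimal sketch (not part of the commit above) of the `processor.apply_chat_template()` flow the updated README recommends. The checkpoint name and message layout are taken from the pipeline example; exact behavior may vary with the installed `transformers` version.

```python
# Minimal sketch of the chat-template flow recommended in the updated README.
# Assumes the same checkpoint as the pipeline example above.
from transformers import LlavaNextProcessor

processor = LlavaNextProcessor.from_pretrained("llava-hf/llava-v1.6-34b-hf")

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image"},
            {"type": "text", "text": "What does the label 15 represent? (1) lava (2) core (3) tunnel (4) ash cloud"},
        ],
    },
]

# apply_chat_template renders the checkpoint's own chat template, so the
# "<|im_start|>..." string does not need to be written by hand.
prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
print(prompt)
```

The rendered prompt should correspond to the hand-written template shown in the diff, with the `<image>` placeholder inserted where the image content entry appears.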