stazizov committed on
Commit a94a369
1 Parent(s): 7400763

Update README.md

Files changed (1)
  1. README.md +9 -2
README.md CHANGED
@@ -41,8 +41,15 @@ To try our models, you have 2 options:
  2. Use our custom nodes for ComfyUI and test it with provided workflows (check out folder /workflows)

  ## Instruction for ComfyUI
- Put `clip_vision_l.safetensors` to `ComfyUI/models/clip_vision` and `ip_adapter.safetensors` to `ComfyUI/models/xlabs/ipadapters`
- Please, try our workflow "ip_adapter_workflow.json"
+ 1. Update x-flux-comfy with `git pull` or reinstall it.
+ 2. Download the Clip-L `model.safetensors` from [OpenAI VIT CLIP large](https://huggingface.co/openai/clip-vit-large-patch14) and put it into `ComfyUI/models/clip_vision/*`.
+ 3. Download our IPAdapter from [huggingface](https://huggingface.co/XLabs-AI/flux-ip-adapter/tree/main) and put it into `ComfyUI/models/xlabs/ipadapters/*`.
+ 4. Use the `Flux Load IPAdapter` and `Apply Flux IPAdapter` nodes, choose the right CLIP model, and enjoy your generations.
+ 5. You can find an example workflow in the workflows folder of this repo.
+
+ ### Limitations
+ The IP Adapter is currently in beta.
+ We do not guarantee that you will get a good result right away; it may take several attempts to get one.

  ![Example Picture 2](assets/ip_adapter_example2.png?raw=true)
  ![Example Picture 1](assets/ip_adapter_example1.png?raw=true)
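
For anyone who wants to script steps 2 and 3 of the updated instructions, a minimal sketch using `huggingface_hub` could look like the following. It is not part of the commit: the ComfyUI install path and the IP-Adapter filename (`ip_adapter.safetensors`, taken from the older README text) are assumptions, so check the repositories for the actual file names before relying on it.

```python
# Sketch only: fetch the CLIP-L vision encoder and the XLabs IP-Adapter
# into a local ComfyUI install. Assumes `pip install huggingface_hub`.
from pathlib import Path

from huggingface_hub import hf_hub_download

COMFY_ROOT = Path("ComfyUI")  # adjust to your ComfyUI location

targets = [
    # (repo_id, filename in the repo, subfolder under ComfyUI/models)
    ("openai/clip-vit-large-patch14", "model.safetensors", "clip_vision"),
    # The filename below follows the previous README; verify it on the Hub.
    ("XLabs-AI/flux-ip-adapter", "ip_adapter.safetensors", "xlabs/ipadapters"),
]

for repo_id, filename, subdir in targets:
    dest = COMFY_ROOT / "models" / subdir
    dest.mkdir(parents=True, exist_ok=True)
    # local_dir makes hf_hub_download place the file directly in `dest`
    # instead of only caching it under ~/.cache/huggingface.
    path = hf_hub_download(repo_id=repo_id, filename=filename, local_dir=dest)
    print(f"{repo_id}/{filename} -> {path}")
```

After the files are in place, you may need to restart ComfyUI (or refresh its model lists) so the `Flux Load IPAdapter` node can see the new checkpoints.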