Batch inference

#59
by epishchik - opened

Does Phi3V support batch inference? I want to process N questions for N images (one question per image) in parallel, in a single forward pass at inference time.
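
To make the request concrete, here is a minimal sketch of the batched call I have in mind (the model id and prompt template follow the Phi-3-vision model card; the image paths are placeholders, and as the comment below notes, passing `text` as a list is documented but not actually handled, so this currently fails):

```python
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor

model_id = "microsoft/Phi-3-vision-128k-instruct"
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, trust_remote_code=True, torch_dtype="auto", device_map="cuda"
)

images = [Image.open("cat.jpg"), Image.open("dog.jpg")]
prompts = [
    "<|user|>\n<|image_1|>\nWhat animal is this?<|end|>\n<|assistant|>\n",
    "<|user|>\n<|image_1|>\nWhat color is its fur?<|end|>\n<|assistant|>\n",
]

# Passing lists here is the batched API being asked about.
inputs = processor(text=prompts, images=images, return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=64)
print(processor.batch_decode(outputs, skip_special_tokens=True))
```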

Really need this too.

Plus one for this.
I ran into a TypeError when I tried to implement batch inference myself: the docstring says the `text` argument can take `List[str]`, but there is no code path in processing_phi3_v.py that actually handles a list (one possible manual workaround is sketched below).

Hope to get this feature soon.
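
In the meantime, here is a rough, untested sketch of a manual workaround: run the processor once per (prompt, image) pair, left-pad the token ids, and concatenate the image tensors before a single `generate` call. The `pad_left` and `batch_generate` helpers are my own, and the concatenation only works if every image yields `pixel_values` of the same shape (e.g. after resizing all images to a common resolution), since the number of crops otherwise varies per image:

```python
import torch
from transformers import AutoModelForCausalLM, AutoProcessor

model_id = "microsoft/Phi-3-vision-128k-instruct"
processor = AutoProcessor.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, trust_remote_code=True, torch_dtype="auto", device_map="cuda"
)

pad_id = processor.tokenizer.pad_token_id
if pad_id is None:
    pad_id = processor.tokenizer.eos_token_id


def pad_left(seqs, pad_value):
    # Left-pad 1-D tensors to a common length (decoder-only models
    # generate from the right, so padding belongs on the left).
    length = max(s.shape[-1] for s in seqs)
    return torch.stack(
        [
            torch.nn.functional.pad(s, (length - s.shape[-1], 0), value=pad_value)
            for s in seqs
        ]
    )


def batch_generate(prompts, images, max_new_tokens=64):
    # Run the processor per sample, then merge the outputs manually.
    per_sample = [
        processor(text=p, images=[img], return_tensors="pt")
        for p, img in zip(prompts, images)
    ]
    input_ids = pad_left([s["input_ids"][0] for s in per_sample], pad_id)
    attention_mask = pad_left([s["attention_mask"][0] for s in per_sample], 0)
    # Assumes identical image sizes so pixel_values shapes match across samples.
    pixel_values = torch.cat([s["pixel_values"] for s in per_sample], dim=0)
    image_sizes = torch.cat([s["image_sizes"] for s in per_sample], dim=0)

    inputs = {
        "input_ids": input_ids.to(model.device),
        "attention_mask": attention_mask.to(model.device),
        "pixel_values": pixel_values.to(model.device),
        "image_sizes": image_sizes.to(model.device),
    }
    out = model.generate(**inputs, max_new_tokens=max_new_tokens, pad_token_id=pad_id)
    # Strip the prompt tokens before decoding.
    out = out[:, input_ids.shape[1]:]
    return processor.batch_decode(out, skip_special_tokens=True)
```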
