Phi-3-small doesn't load with TGI
#24
by aveer30 - opened
Hi, inference still fails with TGI for Phi-3-small:
```
  File "/opt/conda/lib/python3.10/site-packages/text_generation_server/server.py", line 220, in serve_inner
    model = get_model(
  File "/opt/conda/lib/python3.10/site-packages/text_generation_server/models/__init__.py", line 908, in get_model
    raise ValueError(f"Unsupported model type {model_type}")
ValueError: Unsupported model type phi3small
```
The issue has also been raised on the TGI repo (linked below). Would really appreciate it if any of you could take a look at this. Thanks.
https://github.com/huggingface/text-generation-inference/issues/1974
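For anyone wondering why it fails at startup: the traceback shows TGI dispatching on the `model_type` field from the model's `config.json` and raising when it has no matching loader. Below is a minimal sketch of that dispatch, assuming a simplified illustrative registry (`SUPPORTED_MODEL_TYPES` and the loader string are hypothetical, not TGI's actual internals).

```python
# Minimal sketch of TGI's get_model() dispatch on `model_type`.
# The registry below is illustrative only, not TGI's real one.
SUPPORTED_MODEL_TYPES = {"llama", "mistral", "phi", "phi3"}  # hypothetical

def get_model(model_type: str) -> str:
    # TGI looks up a model-specific loader by the config's model_type;
    # anything not registered hits this ValueError at serve time.
    if model_type not in SUPPORTED_MODEL_TYPES:
        raise ValueError(f"Unsupported model type {model_type}")
    return f"loader-for-{model_type}"

# Phi-3-small ships config.json with "model_type": "phi3small",
# which is not registered, so serving fails before any inference runs:
try:
    get_model("phi3small")
except ValueError as e:
    print(e)  # Unsupported model type phi3small
```

So this isn't a weights or hardware problem; until TGI registers a `phi3small` loader, the server will reject the model at startup.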
Hi,
I have exactly the same issue. Is there any update?