Text Generation Inference

#4
by jarekmor - opened

Hi!
Great work!

Is it possible to add the Bielik model to the "Supported Models and Hardware" list for HF Text Generation Inference?

SpeakLeash | Spichlerz org

It should work optimally with TGI; the Bielik model architecture is on that list.

Thank you for the response.
Yes, I know that Bielik is based on the Mistral architecture. I should clarify: I meant compatibility with the TGI Messages API, which is compatible with the OpenAI Chat Completions API (https://huggingface.co./docs/text-generation-inference/messages_api).

When I run the following script:

from openai import OpenAI

# init the client but point it to TGI
client = OpenAI(
    base_url="http://localhost:8080/v1",
    api_key="-"
)
chat_completion = client.chat.completions.create(
    model="tgi",
    messages=[
        {"role": "user", "content": "What is deep learning?"}
    ],
    stream=False
)
print(chat_completion.choices[0].message.content)

I get the following traceback:

Traceback (most recent call last):
  File "/home/hp/Python_Projects/HF_OpenAI/test.py", line 9, in <module>
    chat_completion = client.chat.completions.create(
  File "/home/hp/Python_Projects/HF_OpenAI/.venv/lib/python3.10/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
  File "/home/hp/Python_Projects/HF_OpenAI/.venv/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 667, in create
    return self._post(
  File "/home/hp/Python_Projects/HF_OpenAI/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1233, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/home/hp/Python_Projects/HF_OpenAI/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 922, in request
    return self._request(
  File "/home/hp/Python_Projects/HF_OpenAI/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1013, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.UnprocessableEntityError: Error code: 422 - {'error': 'Template error: invalid operation: object has no method named strip (in <string>:1)', 'error_type': 'template_error'}

The Mistral model works without any issues.
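For what it's worth, the 422 "object has no method named strip" suggests the model's Jinja chat template calls the Python string method .strip(), which TGI's minijinja-based template renderer does not implement. A possible workaround (a sketch, assuming the template lives under the "chat_template" key of a locally downloaded tokenizer_config.json) is to rewrite those calls as the trim filter, which minijinja does support, and point TGI at the patched config:

```python
import json
import re

def patch_chat_template(path: str) -> None:
    """Replace Python-style .strip() calls in the chat template with
    the `trim` filter, which minijinja (used by TGI) supports."""
    with open(path) as f:
        config = json.load(f)

    template = config.get("chat_template", "")
    # e.g. "{{ message['content'].strip() }}" -> "{{ message['content'] | trim }}"
    config["chat_template"] = re.sub(r"\.strip\(\)", " | trim", template)

    with open(path, "w") as f:
        json.dump(config, f, indent=2)

# Hypothetical local path to the model's tokenizer config
# patch_chat_template("tokenizer_config.json")
```

The patched file can then be passed to the server via TGI's --tokenizer-config-path option, so the original model repository stays untouched.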