Fork of salesforce/BLIP for a feature-extraction task on 🤗 Inference Endpoints.
This repository implements a custom feature-extraction task for 🤗 Inference Endpoints. The code for the customized pipeline is in pipeline.py.
To deploy this model as an Inference Endpoint, you have to select Custom as the task so that the pipeline.py file is used -> double check that it is selected.
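For reference, below is a minimal sketch of what such a custom feature-extraction pipeline could look like. It is illustrative only: the actual implementation in this fork builds on the salesforce/BLIP code shipped in this repository, whereas the sketch uses the 🤗 Transformers BLIP port, and the class name and model id are placeholders.

import base64
from io import BytesIO

import torch
from PIL import Image
from transformers import BlipModel, BlipProcessor


class FeatureExtractionPipeline:
    # illustrative only -- the real logic lives in pipeline.py of this repository
    def __init__(self, model_id: str = "Salesforce/blip-image-captioning-base"):
        # placeholder model id; this fork ships its own BLIP code and weights
        self.processor = BlipProcessor.from_pretrained(model_id)
        self.model = BlipModel.from_pretrained(model_id)
        self.model.eval()

    def __call__(self, inputs: str):
        # "inputs" is the base64-encoded image from the request payload
        image = Image.open(BytesIO(base64.b64decode(inputs))).convert("RGB")
        pixel_values = self.processor(images=image, return_tensors="pt").pixel_values
        with torch.no_grad():
            features = self.model.get_image_features(pixel_values=pixel_values)
        # return a flat list of floats, matching the expected output shown further down
        return features[0].tolist()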
Expected Request payload
{
  "inputs": "/9j/4AAQSkZJRgABAQEBLAEsAAD/2wBDAAMCAgICAgMC...." // base64 image as bytes
}
Below is an example of how to run a request using Python and requests.
Run Request
1. prepare an image
!wget https://huggingface.co./datasets/mishig/sample_images/resolve/main/palace.jpg
2. run the request
import base64

import requests as r

ENDPOINT_URL = "https://api-inference.huggingface.co/models/radames/blip_image_embeddings"
HF_TOKEN = ""  # your Hugging Face access token


def predict(path_to_image: str = None):
    # read the image and send it as a base64-encoded string, as expected by the custom pipeline
    with open(path_to_image, "rb") as i:
        b64 = base64.b64encode(i.read())
    payload = {"inputs": b64.decode("utf-8")}
    response = r.post(
        ENDPOINT_URL,
        headers={"X-Wait-For-Model": "true", "Authorization": f"Bearer {HF_TOKEN}"},
        json=payload,
    )
    return response.json()


prediction = predict(path_to_image="palace.jpg")
Expected output (the image feature vector, truncated here)
[0.016450975090265274,
-0.5551009774208069,
0.39800673723220825,
-0.6809228658676147,
2.053842782974243,
-0.4712907075881958,...]
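The response is a single image embedding, so two images can be compared, for example with cosine similarity. Below is a minimal sketch of such a comparison; it only assumes the predict helper defined above, and second_image.jpg is a placeholder file name.

import numpy as np

# embed two images with the endpoint and compare them
emb_a = np.array(predict(path_to_image="palace.jpg"))
emb_b = np.array(predict(path_to_image="second_image.jpg"))  # placeholder file name

cosine_similarity = float(
    np.dot(emb_a, emb_b) / (np.linalg.norm(emb_a) * np.linalg.norm(emb_b))
)
print(cosine_similarity)  # values closer to 1.0 indicate more similar images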