This model is based on a custom Transformer model that can be installed with:

```bash
pip install git+https://github.com/lucadiliello/bleurt-pytorch.git
```
Now load the model and make predictions with:

```python
import torch
from bleurt_pytorch import BleurtConfig, BleurtForSequenceClassification, BleurtTokenizer

config = BleurtConfig.from_pretrained('lucadiliello/bleurt-large-128')
model = BleurtForSequenceClassification.from_pretrained('lucadiliello/bleurt-large-128')
tokenizer = BleurtTokenizer.from_pretrained('lucadiliello/bleurt-large-128')

references = ["a bird chirps by the window", "this is a random sentence"]
candidates = ["a bird chirps by the window", "this looks like a random sentence"]

model.eval()
with torch.no_grad():
    inputs = tokenizer(references, candidates, padding='longest', return_tensors='pt')
    res = model(**inputs).logits.flatten().tolist()

print(res)
# [1.088831901550293, 0.8341825604438782]
```
Take a look at [lucadiliello/bleurt-pytorch](https://github.com/lucadiliello/bleurt-pytorch) for the PyTorch definitions of `BleurtConfig`, `BleurtForSequenceClassification` and `BleurtTokenizer`.
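Since the model returns one score per (reference, candidate) pair, a common follow-up is to pick the best of several candidates for a single reference. The sketch below shows only that selection step, with the scores stubbed in from the example output above; `best_candidate` is a hypothetical helper, not part of the `bleurt_pytorch` API — in practice the `scores` list would come from `model(**inputs).logits.flatten().tolist()`.

```python
def best_candidate(candidates, scores):
    """Return the highest-scoring candidate and its BLEURT score."""
    if len(candidates) != len(scores):
        raise ValueError("expected exactly one score per candidate")
    # argmax over the score list
    idx = max(range(len(scores)), key=scores.__getitem__)
    return candidates[idx], scores[idx]

candidates = ["a bird chirps by the window", "this looks like a random sentence"]
scores = [1.088831901550293, 0.8341825604438782]  # stubbed from the example above

print(best_candidate(candidates, scores))
```

Note that BLEURT scores are unbounded regression outputs (higher is better), so they are best used to rank candidates rather than read as absolute quality percentages.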