# SummaCoz LoRA Adapter for flan-t5-xxl
This model performs factual consistency classification for summarization and generates natural-language explanations for its judgments.
## Model Details
- Paper: *SummaCoz: A Dataset for Improving the Interpretability of Factual Consistency Detection for Summarization*
- Dataset: nkwbtb/SummaCoz
## Model Usage
```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, pipeline

# The adapter shares the flan-t5-xxl tokenizer; the fine-tuned weights
# are loaded from the SummaCoz repository.
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-xxl")
model = AutoModelForSeq2SeqLM.from_pretrained("nkwbtb/flan-t5-11b-SummaCoz",
                                              torch_dtype="auto",
                                              device_map="auto")
pipe = pipeline("text2text-generation",
                model=model,
                tokenizer=tokenizer)

# NLI-style prompt: the article is the premise, the summary the hypothesis.
PROMPT = """Is the hypothesis true based on the premise? Give your explanation afterwards.
Premise:
{article}
Hypothesis:
{summary}
"""

article = "Goldfish are being caught weighing up to 2kg and koi carp up to 8kg and one metre in length."
summary = "Goldfish are being caught weighing up to 8kg and one metre in length."

print(pipe(PROMPT.format(article=article, summary=summary),
           do_sample=False,
           max_new_tokens=512))

# Expected output:
"""[{'generated_text': '\
No, the hypothesis is not true. \
- The hypothesis states that goldfish are being caught weighing up to 8kg and one metre in length. \
- However, the premise states that goldfish are being caught weighing up to 2kg and koi carp up to 8kg and one metre in length. \
- The difference between the two is that the koi carp is weighing 8kg and the goldfish is weighing 2kg.'}]"""
```
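The generated text leads with a Yes/No verdict followed by a bulleted explanation. If you only need a binary consistency label, a minimal sketch like the one below can extract it; `is_consistent` is a hypothetical helper, not part of the released code, and it assumes the output always begins with the verdict, as in the example above. It reuses `pipe`, `PROMPT`, `article`, and `summary` from the usage snippet.

```python
# Hypothetical helper: derive a binary consistency label from the
# leading Yes/No verdict in the generated explanation.
def is_consistent(generated_text: str) -> bool:
    return generated_text.strip().lower().startswith("yes")

result = pipe(PROMPT.format(article=article, summary=summary),
              do_sample=False,
              max_new_tokens=512)
print(is_consistent(result[0]["generated_text"]))  # False for the example above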
## Citation

BibTeX:

[More Information Needed]
## Model Tree for nkwbtb/flan-t5-11b-SummaCoz

- Base model: nkwbtb/flan-t5-xxl-bf16
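Since the release is a LoRA adapter on this bf16 base, you could also compose the two explicitly with the `peft` library rather than loading through `AutoModelForSeq2SeqLM` alone. This is a minimal sketch, assuming the repository hosts PEFT-format adapter weights:

```python
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM

# Load the bf16 base model, then attach the SummaCoz LoRA adapter
# (assumes the adapter repo is in PEFT format).
base = AutoModelForSeq2SeqLM.from_pretrained("nkwbtb/flan-t5-xxl-bf16",
                                             torch_dtype="auto",
                                             device_map="auto")
model = PeftModel.from_pretrained(base, "nkwbtb/flan-t5-11b-SummaCoz")

# Optionally merge the adapter into the base weights for faster inference.
model = model.merge_and_unload()
```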