# SummaCoz LoRA Adapter for flan-t5-xxl

This LoRA adapter for `google/flan-t5-xxl` classifies whether a summary is factually consistent with its source article and generates an explanation for its verdict.

## Model Usage
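
This repository ships LoRA adapter weights rather than a full model checkpoint. With the `peft` library installed, recent versions of `transformers` detect the adapter config and load the base model together with the adapter automatically, which is what the pipeline example below relies on. The adapter can also be attached to the base model explicitly; a minimal sketch of that route (repo ids as in the example below, variable names illustrative):

```python
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM

# Load the flan-t5-xxl base model, then graft the SummaCoz LoRA weights onto it.
base_model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-xxl",
                                                   torch_dtype="auto",
                                                   device_map="auto")
model = PeftModel.from_pretrained(base_model, "nkwbtb/flan-t5-11b-SummaCoz")
```

The complete pipeline example: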

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, pipeline

# The adapter reuses the base model's tokenizer.
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-xxl")
model = AutoModelForSeq2SeqLM.from_pretrained("nkwbtb/flan-t5-11b-SummaCoz",
                                              torch_dtype="auto",
                                              device_map="auto")
pipe = pipeline("text2text-generation",
                model=model,
                tokenizer=tokenizer)

PROMPT = """Is the hypothesis true based on the premise? Give your explanation afterwards.

Premise:
{article}

Hypothesis:
{summary}
"""

article = "Goldfish are being caught weighing up to 2kg and koi carp up to 8kg and one metre in length."
summary = "Goldfish are being caught weighing up to 8kg and one metre in length."

# Greedy decoding for a deterministic verdict and explanation.
print(pipe(PROMPT.format(article=article, summary=summary),
           do_sample=False,
           max_new_tokens=512))
"""Expected output:
[{'generated_text': '\
No, the hypothesis is not true. \
- The hypothesis states that goldfish are being caught weighing up to 8kg and one metre in length. \
- However, the premise states that goldfish are being caught weighing up to 2kg and koi carp up to 8kg and one metre in length. \
- The difference between the two is that the koi carp is weighing 8kg and the goldfish is weighing 2kg.'}]"""
```
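
As the sample output shows, the generated text opens with a yes/no verdict followed by bullet-point reasoning. If you only need a binary consistency label, a simple post-processing step is enough; the `parse_verdict` helper below is a hypothetical sketch, not part of this repository:

```python
def parse_verdict(generated_text: str) -> bool:
    """Return True if the model judged the summary consistent.

    Assumes the answer starts with "Yes" or "No", as in the sample output above.
    """
    return generated_text.strip().lower().startswith("yes")

result = pipe(PROMPT.format(article=article, summary=summary),
              do_sample=False,
              max_new_tokens=512)
print(parse_verdict(result[0]["generated_text"]))  # False for the example above
```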

## Citation

BibTeX:

[More Information Needed]
