
Uses

Load the tokenizer and model with transformers and run generation on a clinical-trial eligibility description to extract lots:

from transformers import T5Tokenizer, T5ForConditionalGeneration

# Load the tokenizer and the fine-tuned model
tokenizer = T5Tokenizer.from_pretrained("alpeshsonar/lot-t5-small-filter")
model = T5ForConditionalGeneration.from_pretrained("alpeshsonar/lot-t5-small-filter")

input_text = """Extract lots from given text.
* age ≥18 years * patients with de novo or secondary AML, with an unfavorable or intermediate karyotype (according to the 2017 ELN classification), or patients with relapsing AML who may receive second-line treatment * not candidates for intensive induction, for the following reasons* 75 years or ≥ 18 to 74 years and at least one of the following comorbidities: PS ≥ 2 or a history of heart failure requiring treatment or LVEF ≤ 50% or chronic stable angina or FEV1 ≤ 65% or DLCO ≤ 65% or creatinine clearance <45 ml / min; or liver damage with total bilirubin> 1.5 N or other comorbidities that the hematologist considers incompatible with intensive treatment * ineligible for a classic allogeneic hematopoietic stem cell transplant due to the presence of co-morbidities or too high a risk of toxicity >70 years old or at least one of the following comorbidities: PS ≥ 2 or a history of heart failure requiring treatment or LVEF ≤ 50% or chronic stable angina or FEV1 ≤ 65% or DLCO ≤ 65% or creatinine clearance <45 ml / min; or liver damage with total bilirubin> 1.5 N * may receive chemotherapy with hypomethylating agents have a partially compatible (haplo-identical) major family donor (≥18 years old) eligible for lymphocyte donation.
"""
# Tokenize the prompt, generate, and decode the output
input_ids = tokenizer(input_text, return_tensors="pt").input_ids
outputs = model.generate(input_ids, max_new_tokens=1024)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
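
The same extraction can also be run through the text2text-generation pipeline, which wraps tokenization, generation, and decoding in a single call. This is a minimal sketch under default pipeline settings, reusing the input_text prompt built above and the same max_new_tokens limit.

from transformers import pipeline

# The pipeline handles tokenization, generation, and decoding internally
pipe = pipeline("text2text-generation", model="alpeshsonar/lot-t5-small-filter")

# input_text is the same prompt constructed in the example above
result = pipe(input_text, max_new_tokens=1024)
print(result[0]["generated_text"])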
Model size: 60.5M parameters (BF16, Safetensors)

Base model: google-t5/t5-small (fine-tuned to produce alpeshsonar/lot-t5-small-filter)