New version: LarkAI/codet5p-770m_nl2sql_oig
It is trained on the oig-sql dataset and supports parsing more complex SQL.
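A minimal loading sketch for the newer checkpoint, assuming it exposes the standard seq2seq interface and uses the same prompt format as the BART example below (both points are assumptions, not confirmed by this card):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumption: the CodeT5+ checkpoint loads via the generic seq2seq auto classes.
device = torch.device("cuda:0")
tokenizer = AutoTokenizer.from_pretrained("LarkAI/codet5p-770m_nl2sql_oig")
model = AutoModelForSeq2SeqLM.from_pretrained("LarkAI/codet5p-770m_nl2sql_oig").to(device)
```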
How to Use
import torch
from transformers import AutoTokenizer, BartForConditionalGeneration

device = torch.device("cuda:0")
tokenizer = AutoTokenizer.from_pretrained("LarkAI/bart_large_nl2sql")
model = BartForConditionalGeneration.from_pretrained("LarkAI/bart_large_nl2sql").to(device)

# Input format: the natural-language question followed by the table columns.
text = "question: get people name with age less 25 table: id, name, age"
inputs = tokenizer([text], max_length=1024, return_tensors="pt")
output_ids = model.generate(inputs["input_ids"].to(device), num_beams=4, max_length=128, min_length=8)
response_text = tokenizer.batch_decode(output_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0]
# SELECT name FROM table WHERE age < 25
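For repeated queries it can be convenient to wrap the steps above in a small helper. The function below reuses the `tokenizer`, `model`, and `device` defined above; its name and the column formatting are illustrative, not part of a released API:

```python
def text_to_sql(question: str, columns: list[str]) -> str:
    # Build the "question: ... table: ..." prompt expected by the model.
    prompt = f"question: {question} table: {', '.join(columns)}"
    inputs = tokenizer([prompt], max_length=1024, return_tensors="pt")
    output_ids = model.generate(inputs["input_ids"].to(device), num_beams=4, max_length=128, min_length=8)
    return tokenizer.batch_decode(output_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0]

print(text_to_sql("get people name with age less 25", ["id", "name", "age"]))
# SELECT name FROM table WHERE age < 25
```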
Reference: juierror/flan-t5-text2sql-with-schema (this model addresses an issue raised in that model's discussion).
How to Train
Quick start: https://github.com/huggingface/transformers/blob/main/examples/pytorch/summarization/README.md
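A minimal Python sketch of the same workflow as the linked example script, shown here with a placeholder base checkpoint, a tiny in-memory dataset, and placeholder hyperparameters; none of these reflect the exact recipe used for this model:

```python
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    BartForConditionalGeneration,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "facebook/bart-large"  # assumed base checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# Tiny illustrative dataset in the "question: ... table: ..." -> SQL format.
raw = Dataset.from_dict({
    "source": ["question: get people name with age less 25 table: id, name, age"],
    "target": ["SELECT name FROM table WHERE age < 25"],
})

def preprocess(batch):
    model_inputs = tokenizer(batch["source"], max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["target"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="bart_nl2sql_out",
    per_device_train_batch_size=4,
    num_train_epochs=3,
    learning_rate=3e-5,
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```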