---
library_name: transformers
tags:
  - falcon
  - QLoRA
  - falcon7b-linear-equations
license: apache-2.0
datasets:
  - Menouar/LinearEquations
pipeline_tag: text-generation
---

# falcon7b-linear-equations-merged

This model is the merged version of **falcon7b-linear-equations**, a Falcon-7B model fine-tuned with QLoRA on the Menouar/LinearEquations dataset: the trained LoRA adapter weights have been folded back into the base model, so it can be loaded and used without PEFT.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# device_map and torch_dtype are model-loading options, not tokenizer options
model = AutoModelForCausalLM.from_pretrained(
    "Menouar/falcon7b-linear-equations-merged",
    device_map="auto",
    torch_dtype=torch.float16,
)
tokenizer = AutoTokenizer.from_pretrained("Menouar/falcon7b-linear-equations-merged")

pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

outputs = pipe("Solve for y: 10 + 4y -9y +5 = 4 +8y - 2y",
               max_new_tokens=172,
               do_sample=True,
               temperature=0.1,
               top_k=50,
               top_p=0.1,
               eos_token_id=pipe.tokenizer.eos_token_id,
               pad_token_id=pipe.tokenizer.pad_token_id)

for seq in outputs:
    print(seq["generated_text"])
```
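As a sanity check on the model's output, the example prompt can be solved exactly: `10 + 4y - 9y + 5 = 4 + 8y - 2y` simplifies to `-5y + 15 = 6y + 4`, giving `y = 1`. A minimal pure-Python sketch (the helper function is ours, not part of this model):

```python
from fractions import Fraction

def solve_linear(a, b, c, d):
    """Solve a*y + b = c*y + d for y (assumes a != c)."""
    return Fraction(d - b, a - c)

# 10 + 4y - 9y + 5 = 4 + 8y - 2y  simplifies to  -5y + 15 = 6y + 4
y = solve_linear(-5, 15, 6, 4)
print(y)  # -> 1
```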