## Inference
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

base_model = "meta-llama/Llama-2-7b-chat-hf"

tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    base_model, trust_remote_code=True, device_map="cuda"
)

# Load the FinGPT forecaster LoRA adapter on top of the base model
model = PeftModel.from_pretrained(model, "FinGPT/fingpt-forecaster_sz50_llama2-7B_lora")
model = model.eval()
```
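Since the adapter is trained on top of Llama-2-chat, prompts should follow the standard Llama-2 chat template (`[INST] <<SYS>> ... <</SYS>> ... [/INST]`). Below is a minimal sketch of building such a prompt by hand; the system and user messages are placeholders, not FinGPT's actual forecaster prompt:

```python
def build_llama2_chat_prompt(system_msg: str, user_msg: str) -> str:
    # Standard Llama-2 chat format for a single-turn conversation:
    # the system message is wrapped in <<SYS>> tags inside the [INST] block.
    return f"[INST] <<SYS>>\n{system_msg}\n<</SYS>>\n\n{user_msg} [/INST]"

# Placeholder messages for illustration only
prompt = build_llama2_chat_prompt(
    "You are a seasoned financial analyst.",
    "Summarize the outlook for the SZ50 index over the next week.",
)
```

The resulting string can then be tokenized (`tokenizer(prompt, return_tensors="pt")`) and passed to `model.generate` to produce the forecast text. Alternatively, `tokenizer.apply_chat_template` can build the same format from a list of role/content messages.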