## Inference
``` python
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel

base_model = "meta-llama/Llama-2-7b-chat-hf"

# Load the tokenizer and the Llama-2 base model
tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(base_model, trust_remote_code=True, device_map="cuda")

# Apply the FinGPT forecaster LoRA adapter on top of the base model
model = PeftModel.from_pretrained(model, "FinGPT/fingpt-forecaster_sz50_llama2-7B_lora")
model = model.eval()
```
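
With the adapter loaded, generation follows the standard `transformers` pattern. The sketch below is a minimal example: the prompt text is a placeholder, not the official FinGPT forecaster prompt template, and the generation parameters are illustrative.

``` python
import torch

# Placeholder prompt in Llama-2 chat style; replace with the forecaster's
# actual prompt template and company/news context.
prompt = "[INST] Analyze the recent news and forecast next week's stock movement for the company. [/INST]"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Greedy decoding with an illustrative token budget
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=False)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```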