Tags: Text Generation · Transformers · NeMo · Safetensors · mistral · text-generation-inference · Inference Endpoints
srvm committed on
Commit
6628063
1 Parent(s): f5c6c67

Fix usage example

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -61,7 +61,7 @@ tokenizer = AutoTokenizer.from_pretrained(model_path)
 
 device = 'cuda'
 dtype = torch.bfloat16
-model = LlamaForCausalLM.from_pretrained(model_path, torch_dtype=dtype, device_map=device)
+model = AutoModelForCausalLM.from_pretrained(model_path, torch_dtype=dtype, device_map=device)
 
 # Prepare the input text
 prompt = 'Complete the paragraph: our solar system is'
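The corrected line loads the checkpoint with `AutoModelForCausalLM`, which selects the architecture class from the checkpoint's `config.json` instead of hard-coding `LlamaForCausalLM`. A minimal sketch of that dispatch pattern (the registry and class names below are illustrative, not the actual transformers internals):

```python
# Sketch of the auto-class idea: a registry maps the "model_type"
# recorded in a checkpoint's config to a concrete model class.
# All names here are hypothetical stand-ins, not transformers code.

class LlamaModel:
    def __init__(self, path):
        self.path = path

class MistralModel:
    def __init__(self, path):
        self.path = path

# Registry keyed by the "model_type" field of config.json.
_MODEL_REGISTRY = {
    "llama": LlamaModel,
    "mistral": MistralModel,
}

class AutoModel:
    @classmethod
    def from_pretrained(cls, path, config):
        # Dispatch on the checkpoint's declared model_type, so the
        # caller never needs to know the architecture in advance.
        model_type = config["model_type"]
        try:
            model_cls = _MODEL_REGISTRY[model_type]
        except KeyError:
            raise ValueError(f"Unknown model_type: {model_type!r}")
        return model_cls(path)

# A checkpoint declaring model_type "mistral" resolves to MistralModel
# even though the caller never names that class.
model = AutoModel.from_pretrained("model_path", {"model_type": "mistral"})
```

This is presumably why the one-line fix matters here: the repository is tagged `mistral`, so the auto class resolves the correct architecture from the config, whereas the hard-coded `LlamaForCausalLM` ties the usage example to one specific model family.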