chsafouane committed (verified)
Commit d278f54 · Parent(s): 98a3984

Update README.md

Files changed (1): README.md (+3 -13)
README.md CHANGED

````diff
@@ -1,22 +1,15 @@
 ---
 tags:
-- autotrain
 - text-generation-inference
 - text-generation
 - peft
 library_name: transformers
 base_model: meta-llama/Meta-Llama-3-8B
 widget:
-- messages:
-  - role: user
-    content: What is your favorite condiment?
+- input: Le médecin a prescrit de l'amoxicilline pour traiter l'infection pulmonaire chronique du patient diabétique.
 license: other
 ---
 
-# Model Trained Using AutoTrain
-
-This model was trained using AutoTrain. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain).
-
 # Usage
 
 ```python
@@ -33,14 +26,11 @@ model = AutoModelForCausalLM.from_pretrained(
 ).eval()
 
 # Prompt content: "hi"
-messages = [
-    {"role": "user", "content": "hi"}
-]
+input = "Le médecin a prescrit de l'amoxicilline pour traiter l'infection pulmonaire chronique du patient diabétique."
 
-input_ids = tokenizer.apply_chat_template(conversation=messages, tokenize=True, add_generation_prompt=True, return_tensors='pt')
+input_ids = tokenizer(input, tokenize=True, return_tensors='pt')
 output_ids = model.generate(input_ids.to('cuda'))
 response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
 
-# Model response: "Hello! How can I assist you today?"
 print(response)
 ```
````
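A note on the snippet this commit introduces: as committed it does not run, because `tokenize=True` is not a valid argument to a tokenizer's `__call__`, and the call returns a `BatchEncoding` (a dict-like object), so `input_ids.shape[1]` fails before decoding. A minimal corrected sketch of the intended usage, assuming the standard `transformers` API (the helper name, dtype, and generation settings are my own choices, not from the commit, and actually running it requires the gated Llama-3 base weights plus a GPU):

```python
def generate_response(model_name: str, prompt: str, max_new_tokens: int = 64) -> str:
    """Tokenize `prompt`, generate a continuation, and decode only the new tokens.

    Corrected version of the snippet in this commit: the tokenizer call takes no
    `tokenize=` argument, and the `input_ids` tensor must be pulled out of the
    returned BatchEncoding before its shape can be indexed.
    """
    # Imports deferred so the sketch can be read/loaded without the heavy deps.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        torch_dtype=torch.bfloat16,  # assumption; the commit's dtype is not shown
        device_map="auto",
    ).eval()

    # tokenizer(...) returns a BatchEncoding; take the input_ids tensor explicitly
    # and move it to wherever the model was placed.
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(model.device)
    with torch.no_grad():
        output_ids = model.generate(input_ids, max_new_tokens=max_new_tokens)

    # Slice off the prompt tokens so only the model's continuation is decoded.
    return tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True)
```

Called with this repo's model id and the French example sentence from the widget, this reproduces what the committed snippet appears to intend; loading the 8B weights needs sufficient GPU memory.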