rubenroy committed · Commit c3cac38 (verified) · Parent: 02ed83e

Update README.md

Files changed (1): README.md (+3 -3)
README.md CHANGED
@@ -19,7 +19,7 @@ pipeline_tag: text-generation
 library_name: transformers
 ---
 
-![Zunich Banner](https://cdn.ruben-roy.com/AI/Zurich/img/banner-7b-500k.png)
+![Zunich Banner](https://cdn.ruben-roy.com/AI/Zurich/img/banner-7B-500k.png)
 
 # Zurich 7B GammaCorpus v2-500k
 *A Qwen 2.5 model fine-tuned on the GammaCorpus dataset*
@@ -57,7 +57,7 @@ Here is a code snippet with `apply_chat_template` to show you how to load the to
 ```python
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
-model_name = "rubenroy/Zurich-7b-GCv2-500k"
+model_name = "rubenroy/Zurich-7B-GCv2-500k"
 
 model = AutoModelForCausalLM.from_pretrained(
     model_name,
@@ -68,7 +68,7 @@ tokenizer = AutoTokenizer.from_pretrained(model_name)
 
 prompt = "How tall is the Eiffel tower?"
 messages = [
-    {"role": "system", "content": "You are Zurich, an AI assistant built on the Qwen 2.5 7b model developed by Alibaba Cloud, and fine-tuned by Ruben Roy. You are a helpful assistant."},
+    {"role": "system", "content": "You are Zurich, an AI assistant built on the Qwen 2.5 7B model developed by Alibaba Cloud, and fine-tuned by Ruben Roy. You are a helpful assistant."},
     {"role": "user", "content": prompt}
 ]
 text = tokenizer.apply_chat_template(
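
The snippet touched by the second and third hunks is the README's `apply_chat_template` usage example, which is only partially visible in the diff context. For reference, below is a minimal end-to-end sketch using the corrected `rubenroy/Zurich-7B-GCv2-500k` repo ID; everything past the `apply_chat_template(` call (its keyword arguments, the `generate` call, and `max_new_tokens=512`) is not shown in this diff and follows the standard Qwen 2.5 chat-template pattern, so treat those details as assumptions rather than the README's exact code.

```python
# Sketch of the full usage example the hunks above modify (assumed Qwen 2.5-style flow).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "rubenroy/Zurich-7B-GCv2-500k"

# torch_dtype/device_map values are assumed; they are not visible in the diff.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

prompt = "How tall is the Eiffel tower?"
messages = [
    {"role": "system", "content": "You are Zurich, an AI assistant built on the Qwen 2.5 7B model developed by Alibaba Cloud, and fine-tuned by Ruben Roy. You are a helpful assistant."},
    {"role": "user", "content": prompt}
]

# Render the chat template to a prompt string, then tokenize it.
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

# Generate, strip the prompt tokens from the output, and decode the reply.
generated_ids = model.generate(**model_inputs, max_new_tokens=512)
generated_ids = [
    output[len(inputs):] for inputs, output in zip(model_inputs.input_ids, generated_ids)
]
print(tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0])
```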