pochunhsu committed
Commit 418dd18 · verified · Parent: 5092f76

Update README.md

Files changed (1)
  1. README.md +3 -3
README.md CHANGED

@@ -6,9 +6,9 @@ language:
 - en
 ---
 
-# Model Card for Breeze-7B-Instruct-v1_0
+# Model Card for MediaTek Research Breeze-7B-Instruct-v1_0
 
-Breeze-7B is a language model family that builds on top of [Mistral-7B](https://huggingface.co/mistralai/Mistral-7B-v0.1), specifically intended for Traditional Chinese use.
+MediaTek Research Breeze-7B (hereinafter referred to as Breeze-7B) is a language model family that builds on top of [Mistral-7B](https://huggingface.co/mistralai/Mistral-7B-v0.1), specifically intended for Traditional Chinese use.
 
 [Breeze-7B-Base](https://huggingface.co/MediaTek-Research/Breeze-7B-Base-v1_0) is the base model for the Breeze-7B series.
 It is suitable for use if you have substantial fine-tuning data to tune it for your specific use case.
@@ -17,7 +17,7 @@ It is suitable for use if you have substantial fine-tuning data to tune it for y
 
 The current release version of Breeze-7B is v1.0, which has undergone a more refined training process compared to Breeze-7B-v0_1, resulting in significantly improved performance in both English and Traditional Chinese.
 
-For details of this model please read our [paper](https://arxiv.org/abs/).
+For details of this model please read our [paper](https://arxiv.org/abs/2403.02712).
 
 Practicality-wise:
 - Breeze-7B-Base expands the original vocabulary with an additional 30,000 Traditional Chinese tokens. With the expanded vocabulary, and everything else being equal, Breeze-7B operates at twice the inference speed for Traditional Chinese to Mistral-7B and Llama 7B. [See [Inference Performance](#inference-performance).]
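The speedup claim in the README follows from how autoregressive decoding works: generation cost scales with the number of tokens produced, so a tokenizer whose expanded vocabulary covers Traditional Chinese text in roughly half as many tokens yields roughly twice the generation speed for the same text. A minimal sketch of that arithmetic, using hypothetical token counts (the function name and the numbers are illustrative, not measurements from the model card):

```python
def relative_inference_speed(tokens_base: int, tokens_expanded: int) -> float:
    """Approximate speedup from a smaller token count for the same text.

    Autoregressive decoding runs one forward pass per generated token,
    so producing the same text in fewer tokens is proportionally faster.
    """
    return tokens_base / tokens_expanded


# Hypothetical counts for one Traditional Chinese passage: a tokenizer
# without dedicated Chinese tokens often falls back to byte/character
# pieces, while an expanded vocabulary encodes whole words as one token.
print(relative_inference_speed(200, 100))  # → 2.0
```

This is the intuition behind the "twice the inference speed" figure; the actual ratio depends on the text and on the measured fertility (tokens per character) of each tokenizer.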