HODACHI committed
Commit a59fa1e · verified · Parent: 2e125bc

Update README.md

Files changed (1)
  1. README.md +11 -13
README.md CHANGED
@@ -6,20 +6,20 @@ tags:
  - conversational
  ---
 
- # EZO model card
+ # [EZO model card]
  ![image/png](https://cdn-uploads.huggingface.co/production/uploads/657e900beaad53ff67ba84db/0OYFqT8kACowa9bY1EZF6.png)
  **Terms of Use**: [Terms](https://www.kaggle.com/models/google/gemma/license/consent/verify/huggingface?returnModelRepoId=google/gemma-2-9b-it)
  **Authors**: Axcxept co., ltd.
 
- ## Model Information
+ ## [Model Information]
  This model is based on Gemma-2-9B-it, enhanced with multiple tuning techniques to improve its general performance. While it excels in Japanese language tasks, it's designed to meet diverse needs globally.
 
  Based on Gemma-2-9B-it, this model has been tuned with multiple techniques to improve its general performance. It excels at Japanese tasks while being designed to meet diverse needs around the world.
 
- ### Benchmark Results
+ ### [Benchmark Results]
  ![image/png](https://cdn-uploads.huggingface.co/production/uploads/657e900beaad53ff67ba84db/XyPo_1rVa_ufmV5SeLepQ.png)
 
- ### Usage
+ ### [Usage]
  Here are some code snippets to quickly get started with the model. First, run:
  `pip install -U transformers`
  Then, copy the snippet from the relevant section for your use case.
@@ -29,7 +29,7 @@ Then, copy the snippet from the relevant section for your use case.
  `pip install -U transformers`
  Run the command above, then copy the snippet from the section relevant to your use case.
 
- ### Chat Template
+ ### [Chat Template]
  ```py
  from transformers import AutoTokenizer, AutoModelForCausalLM
  import transformers
@@ -50,7 +50,7 @@ outputs = model.generate(input_ids=inputs.to(model.device), max_new_tokens=150)
  print(tokenizer.decode(outputs[0]))
  ```
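The two hunks above show only the head and tail of this snippet; README lines 36-49 are unchanged and therefore elided by the diff. A minimal, self-contained sketch of how the visible pieces plausibly fit together follows; the `model_id` placeholder, the bf16 dtype, and the `apply_chat_template` call are assumptions, not the file's actual middle:

```py
# Hedged reconstruction -- only the imports, the model.generate(...) call,
# and the final print are visible in the diff above.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "your-org/your-ezo-model"  # placeholder: substitute this repo's id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 is common for Gemma-2 models
    device_map="auto",
)

# Assumption: the prompt is built with the tokenizer's chat template.
chat = [{"role": "user", "content": "Write a hello world program"}]
inputs = tokenizer.apply_chat_template(
    chat, add_generation_prompt=True, return_tensors="pt"
)

# These two lines appear verbatim in the diff:
outputs = model.generate(input_ids=inputs.to(model.device), max_new_tokens=150)
print(tokenizer.decode(outputs[0]))
```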
 
- ### Template
+ ### [Template]
  ```
  <bos><start_of_turn>user
  Write a hello world program<end_of_turn>
@@ -58,10 +58,8 @@ Write a hello world program<end_of_turn>
  XXXXXX<end_of_turn><eos>
  ```
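The model turn between the two visible template lines is elided by the diff. One way to see the full prompt format is to render the chat template without tokenizing; a small hedged check, assuming this model reuses the gemma-2-9b-it tokenizer:

```py
from transformers import AutoTokenizer

# Assumption: the tokenizer and chat template come from the base model.
tokenizer = AutoTokenizer.from_pretrained("google/gemma-2-9b-it")

chat = [{"role": "user", "content": "Write a hello world program"}]
# tokenize=False returns the raw prompt string rather than token ids.
prompt = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
print(prompt)
# Expected, per the template above:
# <bos><start_of_turn>user
# Write a hello world program<end_of_turn>
# <start_of_turn>model
```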
 
- ### Model Data
- Information about the data used for model training and how it was processed.
-
- #### Training Dataset
+ ### [Model Data]
+ #### [Training Dataset]
  We extracted high-quality data from Japanese Wikipedia and FineWeb to create instruction data. Our innovative training approach allows for performance improvements across various languages and domains, making the model suitable for global use despite its focus on Japanese data.
 
  We extracted only high-quality data from Japanese Wikipedia and FineWeb to build the instruction data. The model is specialized for Japanese here, but the same approach can be applied to any use case worldwide.
@@ -69,7 +67,7 @@ We extracted high-quality data from Japanese Wikipedia and FineWeb to create ins
  https://huggingface.co/datasets/legacy-datasets/wikipedia
  https://huggingface.co/datasets/HuggingFaceFW/fineweb
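The card names the two source corpora but not the extraction procedure. A hedged sketch of pulling text from them with the `datasets` library follows; the config names and the quality heuristic are illustrative assumptions only:

```py
from datasets import load_dataset

# Assumed configs: a Japanese Wikipedia dump and FineWeb's public sample.
wiki = load_dataset("legacy-datasets/wikipedia", "20220301.ja",
                    split="train", streaming=True)
fineweb = load_dataset("HuggingFaceFW/fineweb", name="sample-10BT",
                       split="train", streaming=True)

def looks_high_quality(text: str) -> bool:
    # Placeholder heuristic standing in for the unspecified quality filter.
    return len(text) > 200

wiki_texts = (row["text"] for row in wiki if looks_high_quality(row["text"]))
```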
 
- ### Data Preprocessing
+ #### Data Preprocessing
  We used a plain instruction tuning method to train the model on exemplary responses. This approach enhances the model's ability to understand and generate high-quality responses across various languages and contexts.
 
  Using a plain instruction tuning method, we trained the model on exemplary responses. This improves its ability to understand and generate high-quality responses across a variety of languages and contexts.
@@ -79,9 +77,9 @@ We used a plain instruction tuning method to train the model on exemplary respon
 
  https://huggingface.co/instruction-pretrain/instruction-synthesizer
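To make "plain instruction tuning on exemplary responses" concrete, here is a minimal sketch of shaping one (instruction, exemplary answer) pair into the Gemma-2 chat format shown in the Template section above; it illustrates the data shape only and is not the authors' pipeline:

```py
# Hedged sketch: render an (instruction, exemplary answer) pair in the
# Gemma-2 chat format from the [Template] section.
def to_training_text(instruction: str, answer: str) -> str:
    return (
        "<bos><start_of_turn>user\n"
        f"{instruction}<end_of_turn>\n"
        "<start_of_turn>model\n"
        f"{answer}<end_of_turn><eos>"
    )

print(to_training_text("Write a hello world program", 'print("Hello, world!")'))
```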
 
- ### Hardware
+ ### [Hardware]
  A100 × 4 (trained in 32 hours)
 
- ### We are.
+ ### [We are.]
  [![Axcxept logo](https://cdn-uploads.huggingface.co/production/uploads/657e900beaad53ff67ba84db/8OKW86U986ywttvL2RcbG.png)](https://axcxept.com)