taka-too committed
Commit 5c916f2 · verified · Parent: cdb748d

Update README.md

Files changed (1): README.md (+17, -1)
README.md CHANGED
@@ -9,6 +9,7 @@ tags:
 license: cc-by-nc-sa-4.0
 language:
 - jp
+Training Dataset: Ichikara Instruction (LLM-jp)
 ---
 
 
@@ -17,7 +18,22 @@ language:
 - **Developed by:** taka-too
 - **License:** CC-BY-NC-SA-4.0
 - **Finetuned from model :** llm-jp/llm-jp-3-13b
+- **Training Dataset:** Ichikara Instruction (LLM-jp)
 
-This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library.
+This LLaMA-based model has been fine-tuned for enhanced instruction-following capabilities on the Ichikara Instruction dataset provided by LLM-jp; training ran 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.
+
+Satoshi Sekine, Maya Ando, Michiko Goto, Kumi Suzuki, Daisuke Kawahara, Naoya Inoue, and Kentaro Inui. ichikara-instruction: Constructing Japanese Instruction Data for LLMs. Proceedings of the 30th Annual Meeting of the Association for Natural Language Processing (2024).
+
+# How to Use the Model
+
+You can load the model via the Hugging Face transformers library:
+
+```python
+from transformers import AutoModelForCausalLM, AutoTokenizer
+
+tokenizer = AutoTokenizer.from_pretrained("taka-too/llm-jp-3-13b-it")
+model = AutoModelForCausalLM.from_pretrained("taka-too/llm-jp-3-13b-it")
+```
 
 [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
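The usage section added in this commit stops at loading the model. As a complement, here is a minimal generation sketch that builds on that snippet. The instruction/response prompt template, the dtype and device placement, and the decoding settings are assumptions for illustration; they are not taken from this commit or the model card.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and tokenizer from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("taka-too/llm-jp-3-13b-it")
model = AutoModelForCausalLM.from_pretrained(
    "taka-too/llm-jp-3-13b-it",
    torch_dtype=torch.bfloat16,  # assumption: halves memory vs. fp32
    device_map="auto",           # requires the accelerate package
)

# Assumption: a plain instruction/response template; the template actually
# used during fine-tuning is not stated in this commit.
prompt = "### 指示\n日本の首都はどこですか?\n### 回答\n"  # "Where is the capital of Japan?"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=False,  # greedy decoding, for a deterministic sketch
)

# Print only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```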