---
license: gpl-3.0
library_name: peft
language:
- zh
metrics:
pipeline_tag: text-generation
tags:
- art
- llama-factory
- lora
- generated_from_trainer
base_model: THUDM/chatglm3-6b
model-index:
- name: train_2024-05-02-07-20-40
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ChatGLM3-6B-Chat-DcardStylePost-SFT

This model is a LoRA adapter fine-tuned from [THUDM/chatglm3-6b](https://huggingface.co/THUDM/chatglm3-6b) with LLaMA-Factory on the dcardwom_zhcn_train dataset.
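
A minimal loading sketch, assuming the usual `peft` + `transformers` workflow; the adapter id below is a placeholder for this repository's Hub id (or a local path), and the prompt is only illustrative:

```python
# Minimal loading sketch (assumptions: placeholder adapter id, fp16 and a GPU available).
import torch
from transformers import AutoModel, AutoTokenizer
from peft import PeftModel

base_id = "THUDM/chatglm3-6b"
adapter_id = "<this-adapter-repo-or-local-path>"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(base_id, trust_remote_code=True)
base = AutoModel.from_pretrained(
    base_id, trust_remote_code=True, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

# ChatGLM3's remote code exposes a `chat` helper; the LoRA layers are injected in place,
# so generation goes through the fine-tuned weights.
# Illustrative prompt: "Write a short skincare review post in a Dcard tone."
query = "請用 Dcard 的口吻寫一篇簡短的保養品心得文"
response, history = model.chat(tokenizer, query, history=[])
print(response)
```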

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 3.0
- mixed_precision_training: Native AMP
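
The run itself was launched with LLaMA-Factory; purely as a reference, the settings above map onto roughly the following `transformers` `TrainingArguments` (a sketch, with a placeholder `output_dir`; fp16 is assumed for the "Native AMP" entry):

```python
# Reference sketch of the hyperparameters above as transformers TrainingArguments.
# output_dir is a placeholder; the actual run was driven by LLaMA-Factory.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="train_2024-05-02-07-20-40",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,  # effective train batch size: 2 * 8 = 16
    lr_scheduler_type="cosine",
    num_train_epochs=3.0,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,  # "Native AMP" mixed precision (fp16 assumed)
)
```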

### Training results

### Framework versions

- PEFT 0.10.0
- Transformers 4.40.1
- Pytorch 2.2.0a0+81ea7a4
- Datasets 2.19.0
- Tokenizers 0.19.1
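
To compare a local environment against these versions, a quick check like the following may help (note the listed PyTorch is a container build, so an ordinary 2.2.x install is the closest equivalent):

```python
# Print installed versions to compare against the "Framework versions" list above.
from importlib.metadata import PackageNotFoundError, version

for pkg in ("peft", "transformers", "torch", "datasets", "tokenizers"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
```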