Aditya149 committed
Commit aff3145
1 parent: 5612197

End of training

README.md ADDED
@@ -0,0 +1,213 @@
---
license: other
library_name: peft
tags:
- trl
- sft
- generated_from_trainer
base_model: google/gemma-2b
model-index:
- name: Gemma-2b-chat
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Gemma-2b-chat

This model is a fine-tuned version of [google/gemma-2b](https://huggingface.co/google/gemma-2b) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3272
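
A minimal inference sketch follows; the repo id `Aditya149/Gemma-2b-chat` is an assumption inferred from the committer and model name above, not stated in the card:

```python
# Minimal sketch: load the base model, then attach this LoRA adapter.
# Assumption: the adapter is published at "Aditya149/Gemma-2b-chat".
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

tokenizer = AutoTokenizer.from_pretrained("google/gemma-2b")
base = AutoModelForCausalLM.from_pretrained("google/gemma-2b")
model = PeftModel.from_pretrained(base, "Aditya149/Gemma-2b-chat")

inputs = tokenizer("Hello, how are you?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```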

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reconstruction sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 3
- mixed_precision_training: Native AMP
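
A hedged reconstruction of how these settings map onto `TrainingArguments`; the output directory, eval cadence, and fp16 flag are assumptions (the 100-step cadence is read off the results table below, and Native AMP usually corresponds to `fp16=True`), since the training script is not part of this commit:

```python
# Sketch of the training configuration implied by the list above.
# Assumptions: output_dir, evaluation_strategy/eval_steps, and fp16
# are inferred, not taken from the (unpublished) training script.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="Gemma-2b-chat",   # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=3,
    fp16=True,                    # Native AMP mixed precision
    evaluation_strategy="steps",  # assumed from the eval table
    eval_steps=100,
)
```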

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 2.974 | 0.02 | 100 | 2.9419 |
| 3.1998 | 0.04 | 200 | 2.9117 |
| 2.9839 | 0.06 | 300 | 2.8745 |
| 2.9379 | 0.08 | 400 | 2.8328 |
| 2.9392 | 0.1 | 500 | 2.7909 |
| 2.7735 | 0.12 | 600 | 2.7461 |
| 2.7835 | 0.14 | 700 | 2.7075 |
| 2.7794 | 0.16 | 800 | 2.6695 |
| 2.7355 | 0.18 | 900 | 2.6363 |
| 2.7805 | 0.2 | 1000 | 2.6104 |
| 2.556 | 0.22 | 1100 | 2.5871 |
| 2.596 | 0.24 | 1200 | 2.5652 |
| 2.5838 | 0.26 | 1300 | 2.5432 |
| 2.6598 | 0.28 | 1400 | 2.5247 |
| 2.5475 | 0.3 | 1500 | 2.5100 |
| 2.4459 | 0.31 | 1600 | 2.4954 |
| 2.502 | 0.33 | 1700 | 2.4829 |
| 2.557 | 0.35 | 1800 | 2.4702 |
| 2.4944 | 0.37 | 1900 | 2.4604 |
| 2.4774 | 0.39 | 2000 | 2.4528 |
| 2.4287 | 0.41 | 2100 | 2.4453 |
| 2.5386 | 0.43 | 2200 | 2.4381 |
| 2.363 | 0.45 | 2300 | 2.4322 |
| 2.514 | 0.47 | 2400 | 2.4272 |
| 2.413 | 0.49 | 2500 | 2.4225 |
| 2.4667 | 0.51 | 2600 | 2.4176 |
| 2.4724 | 0.53 | 2700 | 2.4128 |
| 2.3949 | 0.55 | 2800 | 2.4084 |
| 2.4822 | 0.57 | 2900 | 2.4044 |
| 2.4556 | 0.59 | 3000 | 2.4009 |
| 2.4067 | 0.61 | 3100 | 2.3977 |
| 2.3911 | 0.63 | 3200 | 2.3947 |
| 2.3446 | 0.65 | 3300 | 2.3923 |
| 2.3358 | 0.67 | 3400 | 2.3891 |
| 2.3213 | 0.69 | 3500 | 2.3867 |
| 2.4041 | 0.71 | 3600 | 2.3840 |
| 2.4759 | 0.73 | 3700 | 2.3818 |
| 2.4622 | 0.75 | 3800 | 2.3801 |
| 2.3512 | 0.77 | 3900 | 2.3778 |
| 2.3653 | 0.79 | 4000 | 2.3760 |
| 2.3455 | 0.81 | 4100 | 2.3744 |
| 2.4364 | 0.83 | 4200 | 2.3724 |
| 2.2805 | 0.85 | 4300 | 2.3706 |
| 2.5448 | 0.87 | 4400 | 2.3681 |
| 2.3061 | 0.89 | 4500 | 2.3674 |
| 2.2572 | 0.9 | 4600 | 2.3657 |
| 2.3259 | 0.92 | 4700 | 2.3645 |
| 2.4078 | 0.94 | 4800 | 2.3633 |
| 2.3841 | 0.96 | 4900 | 2.3618 |
| 2.5439 | 0.98 | 5000 | 2.3604 |
| 2.4556 | 1.0 | 5100 | 2.3593 |
| 2.3752 | 1.02 | 5200 | 2.3582 |
| 2.3415 | 1.04 | 5300 | 2.3567 |
| 2.2824 | 1.06 | 5400 | 2.3555 |
| 2.3748 | 1.08 | 5500 | 2.3541 |
| 2.2535 | 1.1 | 5600 | 2.3534 |
| 2.3277 | 1.12 | 5700 | 2.3530 |
| 2.394 | 1.14 | 5800 | 2.3518 |
| 2.4876 | 1.16 | 5900 | 2.3511 |
| 2.4705 | 1.18 | 6000 | 2.3503 |
| 2.4394 | 1.2 | 6100 | 2.3499 |
| 2.3898 | 1.22 | 6200 | 2.3488 |
| 2.3789 | 1.24 | 6300 | 2.3483 |
| 2.4315 | 1.26 | 6400 | 2.3472 |
| 2.4065 | 1.28 | 6500 | 2.3463 |
| 2.3331 | 1.3 | 6600 | 2.3456 |
| 2.3415 | 1.32 | 6700 | 2.3452 |
| 2.3433 | 1.34 | 6800 | 2.3448 |
| 2.337 | 1.36 | 6900 | 2.3434 |
| 2.4492 | 1.38 | 7000 | 2.3425 |
| 2.3757 | 1.4 | 7100 | 2.3419 |
| 2.4124 | 1.42 | 7200 | 2.3412 |
| 2.2778 | 1.44 | 7300 | 2.3408 |
| 2.3127 | 1.46 | 7400 | 2.3401 |
| 2.2558 | 1.48 | 7500 | 2.3398 |
| 2.4419 | 1.49 | 7600 | 2.3394 |
| 2.3052 | 1.51 | 7700 | 2.3388 |
| 2.3212 | 1.53 | 7800 | 2.3387 |
| 2.3989 | 1.55 | 7900 | 2.3376 |
| 2.3201 | 1.57 | 8000 | 2.3372 |
| 2.4111 | 1.59 | 8100 | 2.3364 |
| 2.3243 | 1.61 | 8200 | 2.3361 |
| 2.3158 | 1.63 | 8300 | 2.3360 |
| 2.3065 | 1.65 | 8400 | 2.3357 |
| 2.3627 | 1.67 | 8500 | 2.3353 |
| 2.4604 | 1.69 | 8600 | 2.3348 |
| 2.2451 | 1.71 | 8700 | 2.3346 |
| 2.3559 | 1.73 | 8800 | 2.3342 |
| 2.4832 | 1.75 | 8900 | 2.3338 |
| 2.5064 | 1.77 | 9000 | 2.3335 |
| 2.2961 | 1.79 | 9100 | 2.3336 |
| 2.4394 | 1.81 | 9200 | 2.3334 |
| 2.4337 | 1.83 | 9300 | 2.3332 |
| 2.2984 | 1.85 | 9400 | 2.3328 |
| 2.2544 | 1.87 | 9500 | 2.3325 |
| 2.4421 | 1.89 | 9600 | 2.3321 |
| 2.2737 | 1.91 | 9700 | 2.3322 |
| 2.4483 | 1.93 | 9800 | 2.3319 |
| 2.4371 | 1.95 | 9900 | 2.3314 |
| 2.3184 | 1.97 | 10000 | 2.3312 |
| 2.2936 | 1.99 | 10100 | 2.3308 |
| 2.432 | 2.01 | 10200 | 2.3304 |
| 2.3306 | 2.03 | 10300 | 2.3301 |
| 2.3926 | 2.05 | 10400 | 2.3301 |
| 2.358 | 2.07 | 10500 | 2.3300 |
| 2.341 | 2.08 | 10600 | 2.3298 |
| 2.3886 | 2.1 | 10700 | 2.3297 |
| 2.2559 | 2.12 | 10800 | 2.3296 |
| 2.4121 | 2.14 | 10900 | 2.3294 |
| 2.3301 | 2.16 | 11000 | 2.3292 |
| 2.2807 | 2.18 | 11100 | 2.3290 |
| 2.3028 | 2.2 | 11200 | 2.3288 |
| 2.2957 | 2.22 | 11300 | 2.3289 |
| 2.296 | 2.24 | 11400 | 2.3289 |
| 2.248 | 2.26 | 11500 | 2.3288 |
| 2.3639 | 2.28 | 11600 | 2.3286 |
| 2.4383 | 2.3 | 11700 | 2.3284 |
| 2.2921 | 2.32 | 11800 | 2.3282 |
| 2.4594 | 2.34 | 11900 | 2.3282 |
| 2.4243 | 2.36 | 12000 | 2.3280 |
| 2.344 | 2.38 | 12100 | 2.3280 |
| 2.3063 | 2.4 | 12200 | 2.3279 |
| 2.3875 | 2.42 | 12300 | 2.3280 |
| 2.3502 | 2.44 | 12400 | 2.3278 |
| 2.3034 | 2.46 | 12500 | 2.3278 |
| 2.4234 | 2.48 | 12600 | 2.3277 |
| 2.2829 | 2.5 | 12700 | 2.3277 |
| 2.3965 | 2.52 | 12800 | 2.3277 |
| 2.4046 | 2.54 | 12900 | 2.3274 |
| 2.3374 | 2.56 | 13000 | 2.3274 |
| 2.1988 | 2.58 | 13100 | 2.3274 |
| 2.3893 | 2.6 | 13200 | 2.3274 |
| 2.3621 | 2.62 | 13300 | 2.3273 |
| 2.2888 | 2.64 | 13400 | 2.3273 |
| 2.3928 | 2.66 | 13500 | 2.3273 |
| 2.3523 | 2.68 | 13600 | 2.3272 |
| 2.3158 | 2.69 | 13700 | 2.3273 |
| 2.3453 | 2.71 | 13800 | 2.3273 |
| 2.3113 | 2.73 | 13900 | 2.3272 |
| 2.3878 | 2.75 | 14000 | 2.3272 |
| 2.3361 | 2.77 | 14100 | 2.3273 |
| 2.2343 | 2.79 | 14200 | 2.3273 |
| 2.2963 | 2.81 | 14300 | 2.3271 |
| 2.252 | 2.83 | 14400 | 2.3272 |
| 2.4307 | 2.85 | 14500 | 2.3272 |
| 2.2778 | 2.87 | 14600 | 2.3272 |
| 2.3832 | 2.89 | 14700 | 2.3272 |
| 2.3611 | 2.91 | 14800 | 2.3272 |
| 2.3556 | 2.93 | 14900 | 2.3271 |
| 2.3712 | 2.95 | 15000 | 2.3272 |
| 2.3667 | 2.97 | 15100 | 2.3272 |
| 2.3816 | 2.99 | 15200 | 2.3272 |

### Framework versions

- PEFT 0.9.0
- Transformers 4.38.2
- PyTorch 2.1.2
- Datasets 2.17.1
- Tokenizers 0.15.2
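
When reloading the adapter, a quick check of the local environment against these versions can save debugging time; a small sketch:

```python
# Print installed versions to compare against the ones used for training.
import datasets
import peft
import tokenizers
import torch
import transformers

for name, module in [("PEFT", peft), ("Transformers", transformers),
                     ("PyTorch", torch), ("Datasets", datasets),
                     ("Tokenizers", tokenizers)]:
    print(f"{name}: {module.__version__}")
```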
adapter_config.json CHANGED
@@ -10,7 +10,7 @@
  "layers_to_transform": null,
  "loftq_config": {},
  "lora_alpha": 16,
- "lora_dropout": 0.05,
+ "lora_dropout": 0.0,
  "megatron_config": null,
  "megatron_core": "megatron.core",
  "modules_to_save": null,
@@ -19,8 +19,8 @@
  "rank_pattern": {},
  "revision": null,
  "target_modules": [
-   "v_proj",
-   "q_proj"
+   "q_proj",
+   "v_proj"
  ],
  "task_type": "CAUSAL_LM",
  "use_dora": false,
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:76d87af9d8556f15d82b3c22d672465602b5445b11b8f438f93dfed94fb6eb9e
+ oid sha256:8e0515be8c6b5b90c1fd70a82868f5c3d5cf2c63dce4b3d538980f8eb08c0f39
  size 930928
runs/Mar01_09-30-17_3022972a886d/events.out.tfevents.1709285965.3022972a886d.34.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:caec59bdb2009755e4b34ee2c0b1b24b28a940774daaed65351e011b07622793
+ size 78614
special_tokens_map.json CHANGED
@@ -13,7 +13,13 @@
    "rstrip": false,
    "single_word": false
  },
- "pad_token": "<eos>",
+ "pad_token": {
+   "content": "<pad>",
+   "lstrip": false,
+   "normalized": false,
+   "rstrip": false,
+   "single_word": false
+ },
  "unk_token": {
    "content": "<unk>",
    "lstrip": false,
tokenizer_config.json CHANGED
@@ -40,7 +40,7 @@
  "eos_token": "<eos>",
  "legacy": null,
  "model_max_length": 1000000000000000019884624838656,
- "pad_token": "<eos>",
+ "pad_token": "<pad>",
  "sp_model_kwargs": {},
  "spaces_between_special_tokens": false,
  "tokenizer_class": "GemmaTokenizer",
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:a7f4d690e760f5b59321fdddc8722eab1bb8b9892cf14b60f34001334a211cc2
- size 4920
+ oid sha256:4a49f648bf5624cbde70baada009a54ecf02095fffd5872212e6eea929ec90c4
+ size 4856