aayush14 committed on
Commit 05f67a6 · verified · 1 Parent(s): 2d33570

Uploaded PeptideGPT non-hemolytic model
README.md ADDED
@@ -0,0 +1,559 @@
1
+ ---
2
+ license: apache-2.0
3
+ base_model: nferruz/ProtGPT2
4
+ tags:
5
+ - generated_from_trainer
6
+ metrics:
7
+ - accuracy
8
+ model-index:
9
+ - name: output_hemo_neg_3
10
+ results: []
11
+ ---
12
+
13
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
14
+ should probably proofread and complete it, then remove this comment. -->
15
+
16
+ # output_hemo_neg_3
17
+
18
+ This model (the PeptideGPT non-hemolytic checkpoint) is a fine-tuned version of [nferruz/ProtGPT2](https://huggingface.co/nferruz/ProtGPT2) on a dataset of non-hemolytic peptide sequences.
19
+ It achieves the following results on the evaluation set:
20
+ - Loss: 3.9794
21
+ - Accuracy: 0.4240
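The evaluation loss is the mean cross-entropy per token, so the model's perplexity follows directly as exp(loss); a quick check against the value stored in `eval_results.json`:

```python
import math

eval_loss = 3.9793689250946045  # "eval_loss" from eval_results.json
perplexity = math.exp(eval_loss)
print(perplexity)  # ≈ 53.48, matching the stored "perplexity" field
```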
22
+
23
+ ## Model description
24
+
25
+ PeptideGPT is a protein language model for peptide generation built on ProtGPT2. This checkpoint (`output_hemo_neg_3`) is the non-hemolytic variant, fine-tuned to generate peptide sequences that are not expected to lyse red blood cells.
26
+
27
+ ## Intended uses & limitations
28
+
29
+ The model is intended for generating candidate non-hemolytic peptide sequences. Generated sequences are hypotheses only: given the small training set (38 samples) and a final evaluation accuracy of 0.4240, outputs should be filtered computationally and validated experimentally before any downstream use.
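As a generative checkpoint, the model can be sampled with the standard `transformers` text-generation pipeline. A minimal sketch; the repo id `aayush14/output_hemo_neg_3` is an assumption (replace it with this model's actual Hub path), while `max_length=50` and `eos_token_id=0` follow this repo's `config.json`:

```python
def to_sequence(text: str) -> str:
    """Collapse ProtGPT2-style wrapped output into a single peptide string."""
    return "".join(text.replace("<|endoftext|>", "").split())

def generate_peptides(repo_id: str = "aayush14/output_hemo_neg_3", n: int = 3):
    # Heavy import kept local so the helper above works without transformers installed.
    from transformers import pipeline
    generator = pipeline("text-generation", model=repo_id)
    outputs = generator("<|endoftext|>", max_length=50, do_sample=True,
                        num_return_sequences=n, eos_token_id=0)
    return [to_sequence(o["generated_text"]) for o in outputs]
```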
30
+
31
+ ## Training and evaluation data
32
+
33
+ The Trainer metadata records 38 training samples and 4 evaluation samples; per the upload's commit message, these are non-hemolytic peptide sequences.
34
+
35
+ ## Training procedure
36
+
37
+ ### Training hyperparameters
38
+
39
+ The following hyperparameters were used during training:
40
+ - learning_rate: 1e-06
41
+ - train_batch_size: 1
42
+ - eval_batch_size: 8
43
+ - seed: 42
44
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
45
+ - lr_scheduler_type: linear
46
+ - num_epochs: 500.0
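For reference, the list above corresponds roughly to the following `transformers.TrainingArguments`; this is a sketch, and `output_dir` (plus any option not listed above) is an assumption:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="output_hemo_neg_3",   # assumed, not recorded in the card
    learning_rate=1e-6,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=500.0,
)
```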
47
+
48
+ ### Training results
49
+
50
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
51
+ |:-------------:|:-----:|:-----:|:---------------:|:--------:|
52
+ | 5.9415 | 1.0 | 38 | 5.6068 | 0.3084 |
53
+ | 5.7302 | 2.0 | 76 | 5.4263 | 0.3204 |
54
+ | 5.5675 | 3.0 | 114 | 5.2875 | 0.3231 |
55
+ | 5.4594 | 4.0 | 152 | 5.2055 | 0.3250 |
56
+ | 5.3808 | 5.0 | 190 | 5.1589 | 0.3297 |
57
+ | 5.3353 | 6.0 | 228 | 5.1195 | 0.3321 |
58
+ | 5.2946 | 7.0 | 266 | 5.0779 | 0.3338 |
59
+ | 5.2632 | 8.0 | 304 | 5.0432 | 0.3370 |
60
+ | 5.2279 | 9.0 | 342 | 5.0154 | 0.3372 |
61
+ | 5.1999 | 10.0 | 380 | 4.9931 | 0.3377 |
62
+ | 5.1853 | 11.0 | 418 | 4.9701 | 0.3399 |
63
+ | 5.1619 | 12.0 | 456 | 4.9458 | 0.3429 |
64
+ | 5.1395 | 13.0 | 494 | 4.9274 | 0.3438 |
65
+ | 5.1179 | 14.0 | 532 | 4.9080 | 0.3463 |
66
+ | 5.1048 | 15.0 | 570 | 4.8921 | 0.3465 |
67
+ | 5.0837 | 16.0 | 608 | 4.8756 | 0.3470 |
68
+ | 5.067 | 17.0 | 646 | 4.8606 | 0.3492 |
69
+ | 5.0516 | 18.0 | 684 | 4.8469 | 0.3507 |
70
+ | 5.0313 | 19.0 | 722 | 4.8366 | 0.3522 |
71
+ | 5.0225 | 20.0 | 760 | 4.8276 | 0.3526 |
72
+ | 5.0068 | 21.0 | 798 | 4.8179 | 0.3522 |
73
+ | 4.9942 | 22.0 | 836 | 4.8051 | 0.3522 |
74
+ | 4.9758 | 23.0 | 874 | 4.7963 | 0.3526 |
75
+ | 4.9605 | 24.0 | 912 | 4.7843 | 0.3529 |
76
+ | 4.9525 | 25.0 | 950 | 4.7728 | 0.3531 |
77
+ | 4.9409 | 26.0 | 988 | 4.7618 | 0.3524 |
78
+ | 4.9328 | 27.0 | 1026 | 4.7523 | 0.3519 |
79
+ | 4.9168 | 28.0 | 1064 | 4.7444 | 0.3526 |
80
+ | 4.9057 | 29.0 | 1102 | 4.7332 | 0.3551 |
81
+ | 4.8896 | 30.0 | 1140 | 4.7237 | 0.3561 |
82
+ | 4.8869 | 31.0 | 1178 | 4.7156 | 0.3565 |
83
+ | 4.8798 | 32.0 | 1216 | 4.7093 | 0.3568 |
84
+ | 4.8591 | 33.0 | 1254 | 4.7029 | 0.3575 |
85
+ | 4.8548 | 34.0 | 1292 | 4.6946 | 0.3570 |
86
+ | 4.8502 | 35.0 | 1330 | 4.6871 | 0.3595 |
87
+ | 4.8378 | 36.0 | 1368 | 4.6803 | 0.3595 |
88
+ | 4.829 | 37.0 | 1406 | 4.6733 | 0.3600 |
89
+ | 4.8177 | 38.0 | 1444 | 4.6643 | 0.3602 |
90
+ | 4.809 | 39.0 | 1482 | 4.6591 | 0.3607 |
91
+ | 4.8002 | 40.0 | 1520 | 4.6507 | 0.3607 |
92
+ | 4.7938 | 41.0 | 1558 | 4.6438 | 0.3614 |
93
+ | 4.7787 | 42.0 | 1596 | 4.6367 | 0.3617 |
94
+ | 4.7685 | 43.0 | 1634 | 4.6306 | 0.3629 |
95
+ | 4.762 | 44.0 | 1672 | 4.6211 | 0.3636 |
96
+ | 4.7487 | 45.0 | 1710 | 4.6133 | 0.3641 |
97
+ | 4.7451 | 46.0 | 1748 | 4.6058 | 0.3646 |
98
+ | 4.7378 | 47.0 | 1786 | 4.6009 | 0.3658 |
99
+ | 4.7281 | 48.0 | 1824 | 4.5932 | 0.3658 |
100
+ | 4.7196 | 49.0 | 1862 | 4.5889 | 0.3656 |
101
+ | 4.7091 | 50.0 | 1900 | 4.5814 | 0.3666 |
102
+ | 4.7032 | 51.0 | 1938 | 4.5763 | 0.3668 |
103
+ | 4.6978 | 52.0 | 1976 | 4.5731 | 0.3668 |
104
+ | 4.6908 | 53.0 | 2014 | 4.5682 | 0.3673 |
105
+ | 4.6776 | 54.0 | 2052 | 4.5638 | 0.3673 |
106
+ | 4.6667 | 55.0 | 2090 | 4.5588 | 0.3680 |
107
+ | 4.6662 | 56.0 | 2128 | 4.5535 | 0.3685 |
108
+ | 4.6567 | 57.0 | 2166 | 4.5494 | 0.3697 |
109
+ | 4.6492 | 58.0 | 2204 | 4.5433 | 0.3697 |
110
+ | 4.6442 | 59.0 | 2242 | 4.5421 | 0.3697 |
111
+ | 4.632 | 60.0 | 2280 | 4.5368 | 0.3700 |
112
+ | 4.6256 | 61.0 | 2318 | 4.5321 | 0.3705 |
113
+ | 4.6215 | 62.0 | 2356 | 4.5286 | 0.3700 |
114
+ | 4.6142 | 63.0 | 2394 | 4.5240 | 0.3702 |
115
+ | 4.6041 | 64.0 | 2432 | 4.5195 | 0.3710 |
116
+ | 4.5984 | 65.0 | 2470 | 4.5147 | 0.3715 |
117
+ | 4.5919 | 66.0 | 2508 | 4.5116 | 0.3727 |
118
+ | 4.5838 | 67.0 | 2546 | 4.5070 | 0.3724 |
119
+ | 4.5733 | 68.0 | 2584 | 4.5035 | 0.3724 |
120
+ | 4.5642 | 69.0 | 2622 | 4.5007 | 0.3722 |
121
+ | 4.5607 | 70.0 | 2660 | 4.4968 | 0.3719 |
122
+ | 4.5543 | 71.0 | 2698 | 4.4928 | 0.3729 |
123
+ | 4.5502 | 72.0 | 2736 | 4.4897 | 0.3729 |
124
+ | 4.5505 | 73.0 | 2774 | 4.4875 | 0.3737 |
125
+ | 4.537 | 74.0 | 2812 | 4.4840 | 0.3732 |
126
+ | 4.529 | 75.0 | 2850 | 4.4802 | 0.3746 |
127
+ | 4.5201 | 76.0 | 2888 | 4.4764 | 0.3749 |
128
+ | 4.5176 | 77.0 | 2926 | 4.4729 | 0.3751 |
129
+ | 4.5087 | 78.0 | 2964 | 4.4716 | 0.3751 |
130
+ | 4.504 | 79.0 | 3002 | 4.4684 | 0.3744 |
131
+ | 4.4914 | 80.0 | 3040 | 4.4634 | 0.3751 |
132
+ | 4.4907 | 81.0 | 3078 | 4.4616 | 0.3751 |
133
+ | 4.483 | 82.0 | 3116 | 4.4578 | 0.3754 |
134
+ | 4.4792 | 83.0 | 3154 | 4.4541 | 0.3741 |
135
+ | 4.4705 | 84.0 | 3192 | 4.4511 | 0.3744 |
136
+ | 4.4647 | 85.0 | 3230 | 4.4488 | 0.3749 |
137
+ | 4.4617 | 86.0 | 3268 | 4.4445 | 0.3751 |
138
+ | 4.453 | 87.0 | 3306 | 4.4385 | 0.3751 |
139
+ | 4.4488 | 88.0 | 3344 | 4.4353 | 0.3763 |
140
+ | 4.4424 | 89.0 | 3382 | 4.4322 | 0.3766 |
141
+ | 4.433 | 90.0 | 3420 | 4.4300 | 0.3766 |
142
+ | 4.4252 | 91.0 | 3458 | 4.4259 | 0.3763 |
143
+ | 4.4226 | 92.0 | 3496 | 4.4215 | 0.3773 |
144
+ | 4.4144 | 93.0 | 3534 | 4.4189 | 0.3771 |
145
+ | 4.4047 | 94.0 | 3572 | 4.4160 | 0.3771 |
146
+ | 4.4071 | 95.0 | 3610 | 4.4131 | 0.3773 |
147
+ | 4.3975 | 96.0 | 3648 | 4.4095 | 0.3773 |
148
+ | 4.3897 | 97.0 | 3686 | 4.4085 | 0.3771 |
149
+ | 4.3869 | 98.0 | 3724 | 4.4052 | 0.3771 |
150
+ | 4.3751 | 99.0 | 3762 | 4.4021 | 0.3773 |
151
+ | 4.3698 | 100.0 | 3800 | 4.3988 | 0.3768 |
152
+ | 4.368 | 101.0 | 3838 | 4.3945 | 0.3768 |
153
+ | 4.3643 | 102.0 | 3876 | 4.3918 | 0.3771 |
154
+ | 4.3552 | 103.0 | 3914 | 4.3893 | 0.3766 |
155
+ | 4.3478 | 104.0 | 3952 | 4.3869 | 0.3776 |
156
+ | 4.3438 | 105.0 | 3990 | 4.3848 | 0.3781 |
157
+ | 4.3362 | 106.0 | 4028 | 4.3820 | 0.3773 |
158
+ | 4.3356 | 107.0 | 4066 | 4.3768 | 0.3778 |
159
+ | 4.3263 | 108.0 | 4104 | 4.3764 | 0.3776 |
160
+ | 4.3238 | 109.0 | 4142 | 4.3732 | 0.3778 |
161
+ | 4.3157 | 110.0 | 4180 | 4.3699 | 0.3781 |
162
+ | 4.311 | 111.0 | 4218 | 4.3678 | 0.3781 |
163
+ | 4.3048 | 112.0 | 4256 | 4.3646 | 0.3788 |
164
+ | 4.2955 | 113.0 | 4294 | 4.3640 | 0.3793 |
165
+ | 4.2914 | 114.0 | 4332 | 4.3604 | 0.3793 |
166
+ | 4.286 | 115.0 | 4370 | 4.3580 | 0.3790 |
167
+ | 4.2857 | 116.0 | 4408 | 4.3541 | 0.3790 |
168
+ | 4.2776 | 117.0 | 4446 | 4.3527 | 0.3793 |
169
+ | 4.2734 | 118.0 | 4484 | 4.3482 | 0.3803 |
170
+ | 4.2646 | 119.0 | 4522 | 4.3461 | 0.3800 |
171
+ | 4.2632 | 120.0 | 4560 | 4.3446 | 0.3803 |
172
+ | 4.2586 | 121.0 | 4598 | 4.3409 | 0.3807 |
173
+ | 4.2564 | 122.0 | 4636 | 4.3400 | 0.3812 |
174
+ | 4.2423 | 123.0 | 4674 | 4.3357 | 0.3807 |
175
+ | 4.2425 | 124.0 | 4712 | 4.3335 | 0.3807 |
176
+ | 4.2367 | 125.0 | 4750 | 4.3306 | 0.3810 |
177
+ | 4.2301 | 126.0 | 4788 | 4.3292 | 0.3815 |
178
+ | 4.2286 | 127.0 | 4826 | 4.3276 | 0.3812 |
179
+ | 4.2184 | 128.0 | 4864 | 4.3246 | 0.3822 |
180
+ | 4.2156 | 129.0 | 4902 | 4.3210 | 0.3827 |
181
+ | 4.2116 | 130.0 | 4940 | 4.3187 | 0.3834 |
182
+ | 4.2008 | 131.0 | 4978 | 4.3165 | 0.3834 |
183
+ | 4.1995 | 132.0 | 5016 | 4.3134 | 0.3834 |
184
+ | 4.19 | 133.0 | 5054 | 4.3136 | 0.3842 |
185
+ | 4.1828 | 134.0 | 5092 | 4.3116 | 0.3842 |
186
+ | 4.1815 | 135.0 | 5130 | 4.3065 | 0.3847 |
187
+ | 4.1771 | 136.0 | 5168 | 4.3051 | 0.3839 |
188
+ | 4.1744 | 137.0 | 5206 | 4.3016 | 0.3847 |
189
+ | 4.1717 | 138.0 | 5244 | 4.2975 | 0.3847 |
190
+ | 4.1616 | 139.0 | 5282 | 4.2966 | 0.3847 |
191
+ | 4.1582 | 140.0 | 5320 | 4.2948 | 0.3847 |
192
+ | 4.1583 | 141.0 | 5358 | 4.2931 | 0.3849 |
193
+ | 4.148 | 142.0 | 5396 | 4.2894 | 0.3854 |
194
+ | 4.1417 | 143.0 | 5434 | 4.2861 | 0.3849 |
195
+ | 4.1386 | 144.0 | 5472 | 4.2865 | 0.3861 |
196
+ | 4.133 | 145.0 | 5510 | 4.2834 | 0.3861 |
197
+ | 4.129 | 146.0 | 5548 | 4.2793 | 0.3864 |
198
+ | 4.12 | 147.0 | 5586 | 4.2785 | 0.3861 |
199
+ | 4.1206 | 148.0 | 5624 | 4.2750 | 0.3864 |
200
+ | 4.1226 | 149.0 | 5662 | 4.2744 | 0.3871 |
201
+ | 4.1104 | 150.0 | 5700 | 4.2723 | 0.3866 |
202
+ | 4.1093 | 151.0 | 5738 | 4.2677 | 0.3871 |
203
+ | 4.0989 | 152.0 | 5776 | 4.2654 | 0.3869 |
204
+ | 4.1035 | 153.0 | 5814 | 4.2646 | 0.3878 |
205
+ | 4.0949 | 154.0 | 5852 | 4.2635 | 0.3881 |
206
+ | 4.0921 | 155.0 | 5890 | 4.2606 | 0.3883 |
207
+ | 4.0883 | 156.0 | 5928 | 4.2565 | 0.3886 |
208
+ | 4.0794 | 157.0 | 5966 | 4.2558 | 0.3893 |
209
+ | 4.0754 | 158.0 | 6004 | 4.2530 | 0.3888 |
210
+ | 4.0756 | 159.0 | 6042 | 4.2496 | 0.3893 |
211
+ | 4.067 | 160.0 | 6080 | 4.2501 | 0.3888 |
212
+ | 4.0627 | 161.0 | 6118 | 4.2484 | 0.3891 |
213
+ | 4.0586 | 162.0 | 6156 | 4.2439 | 0.3898 |
214
+ | 4.0577 | 163.0 | 6194 | 4.2431 | 0.3893 |
215
+ | 4.055 | 164.0 | 6232 | 4.2391 | 0.3895 |
216
+ | 4.0419 | 165.0 | 6270 | 4.2396 | 0.3895 |
217
+ | 4.0411 | 166.0 | 6308 | 4.2365 | 0.3903 |
218
+ | 4.0405 | 167.0 | 6346 | 4.2356 | 0.3908 |
219
+ | 4.0327 | 168.0 | 6384 | 4.2349 | 0.3905 |
220
+ | 4.0262 | 169.0 | 6422 | 4.2312 | 0.3913 |
221
+ | 4.0252 | 170.0 | 6460 | 4.2300 | 0.3913 |
222
+ | 4.0237 | 171.0 | 6498 | 4.2254 | 0.3915 |
223
+ | 4.024 | 172.0 | 6536 | 4.2248 | 0.3920 |
224
+ | 4.0137 | 173.0 | 6574 | 4.2218 | 0.3922 |
225
+ | 4.0108 | 174.0 | 6612 | 4.2224 | 0.3927 |
226
+ | 4.0037 | 175.0 | 6650 | 4.2190 | 0.3939 |
227
+ | 4.0021 | 176.0 | 6688 | 4.2180 | 0.3937 |
228
+ | 3.9949 | 177.0 | 6726 | 4.2150 | 0.3942 |
229
+ | 3.9957 | 178.0 | 6764 | 4.2135 | 0.3939 |
230
+ | 3.9923 | 179.0 | 6802 | 4.2094 | 0.3942 |
231
+ | 3.9853 | 180.0 | 6840 | 4.2092 | 0.3949 |
232
+ | 3.9779 | 181.0 | 6878 | 4.2086 | 0.3949 |
233
+ | 3.9826 | 182.0 | 6916 | 4.2045 | 0.3947 |
234
+ | 3.9775 | 183.0 | 6954 | 4.2012 | 0.3949 |
235
+ | 3.9706 | 184.0 | 6992 | 4.2005 | 0.3961 |
236
+ | 3.9672 | 185.0 | 7030 | 4.1992 | 0.3957 |
237
+ | 3.9707 | 186.0 | 7068 | 4.1964 | 0.3966 |
238
+ | 3.9585 | 187.0 | 7106 | 4.1951 | 0.3971 |
239
+ | 3.9552 | 188.0 | 7144 | 4.1927 | 0.3966 |
240
+ | 3.9526 | 189.0 | 7182 | 4.1922 | 0.3966 |
241
+ | 3.9514 | 190.0 | 7220 | 4.1886 | 0.3969 |
242
+ | 3.9464 | 191.0 | 7258 | 4.1886 | 0.3976 |
243
+ | 3.9433 | 192.0 | 7296 | 4.1856 | 0.3981 |
244
+ | 3.9378 | 193.0 | 7334 | 4.1846 | 0.3978 |
245
+ | 3.9362 | 194.0 | 7372 | 4.1831 | 0.3981 |
246
+ | 3.9307 | 195.0 | 7410 | 4.1820 | 0.3981 |
247
+ | 3.9324 | 196.0 | 7448 | 4.1767 | 0.3978 |
248
+ | 3.9223 | 197.0 | 7486 | 4.1794 | 0.3983 |
249
+ | 3.9279 | 198.0 | 7524 | 4.1752 | 0.3986 |
250
+ | 3.9214 | 199.0 | 7562 | 4.1727 | 0.3981 |
251
+ | 3.9122 | 200.0 | 7600 | 4.1746 | 0.3988 |
252
+ | 3.9099 | 201.0 | 7638 | 4.1698 | 0.3996 |
253
+ | 3.9075 | 202.0 | 7676 | 4.1692 | 0.3993 |
254
+ | 3.9095 | 203.0 | 7714 | 4.1661 | 0.4000 |
255
+ | 3.9 | 204.0 | 7752 | 4.1637 | 0.4008 |
256
+ | 3.9004 | 205.0 | 7790 | 4.1619 | 0.4003 |
257
+ | 3.8978 | 206.0 | 7828 | 4.1603 | 0.4005 |
258
+ | 3.8918 | 207.0 | 7866 | 4.1583 | 0.4005 |
259
+ | 3.8848 | 208.0 | 7904 | 4.1580 | 0.4008 |
260
+ | 3.8831 | 209.0 | 7942 | 4.1577 | 0.4000 |
261
+ | 3.8821 | 210.0 | 7980 | 4.1550 | 0.4005 |
262
+ | 3.8818 | 211.0 | 8018 | 4.1522 | 0.4008 |
263
+ | 3.8764 | 212.0 | 8056 | 4.1521 | 0.4008 |
264
+ | 3.8704 | 213.0 | 8094 | 4.1491 | 0.4010 |
265
+ | 3.8725 | 214.0 | 8132 | 4.1492 | 0.4010 |
266
+ | 3.8698 | 215.0 | 8170 | 4.1470 | 0.4010 |
267
+ | 3.8654 | 216.0 | 8208 | 4.1465 | 0.4018 |
268
+ | 3.8608 | 217.0 | 8246 | 4.1451 | 0.4020 |
269
+ | 3.8584 | 218.0 | 8284 | 4.1422 | 0.4015 |
270
+ | 3.8546 | 219.0 | 8322 | 4.1412 | 0.4025 |
271
+ | 3.8494 | 220.0 | 8360 | 4.1408 | 0.4022 |
272
+ | 3.8479 | 221.0 | 8398 | 4.1384 | 0.4025 |
273
+ | 3.8463 | 222.0 | 8436 | 4.1365 | 0.4025 |
274
+ | 3.8422 | 223.0 | 8474 | 4.1326 | 0.4030 |
275
+ | 3.8395 | 224.0 | 8512 | 4.1333 | 0.4022 |
276
+ | 3.8369 | 225.0 | 8550 | 4.1338 | 0.4035 |
277
+ | 3.8357 | 226.0 | 8588 | 4.1299 | 0.4047 |
278
+ | 3.8318 | 227.0 | 8626 | 4.1298 | 0.4042 |
279
+ | 3.8258 | 228.0 | 8664 | 4.1298 | 0.4040 |
280
+ | 3.8265 | 229.0 | 8702 | 4.1276 | 0.4044 |
281
+ | 3.8229 | 230.0 | 8740 | 4.1266 | 0.4042 |
282
+ | 3.8139 | 231.0 | 8778 | 4.1253 | 0.4042 |
283
+ | 3.8132 | 232.0 | 8816 | 4.1251 | 0.4047 |
284
+ | 3.8126 | 233.0 | 8854 | 4.1229 | 0.4047 |
285
+ | 3.8074 | 234.0 | 8892 | 4.1216 | 0.4064 |
286
+ | 3.8072 | 235.0 | 8930 | 4.1218 | 0.4066 |
287
+ | 3.8056 | 236.0 | 8968 | 4.1169 | 0.4066 |
288
+ | 3.8038 | 237.0 | 9006 | 4.1169 | 0.4066 |
289
+ | 3.8025 | 238.0 | 9044 | 4.1151 | 0.4066 |
290
+ | 3.7948 | 239.0 | 9082 | 4.1146 | 0.4069 |
291
+ | 3.7929 | 240.0 | 9120 | 4.1120 | 0.4066 |
292
+ | 3.7922 | 241.0 | 9158 | 4.1118 | 0.4069 |
293
+ | 3.7897 | 242.0 | 9196 | 4.1092 | 0.4076 |
294
+ | 3.7877 | 243.0 | 9234 | 4.1080 | 0.4079 |
295
+ | 3.7829 | 244.0 | 9272 | 4.1083 | 0.4071 |
296
+ | 3.7814 | 245.0 | 9310 | 4.1087 | 0.4076 |
297
+ | 3.781 | 246.0 | 9348 | 4.1043 | 0.4071 |
298
+ | 3.7728 | 247.0 | 9386 | 4.1022 | 0.4081 |
299
+ | 3.779 | 248.0 | 9424 | 4.1015 | 0.4081 |
300
+ | 3.7716 | 249.0 | 9462 | 4.1030 | 0.4079 |
301
+ | 3.7674 | 250.0 | 9500 | 4.0995 | 0.4079 |
302
+ | 3.7665 | 251.0 | 9538 | 4.0991 | 0.4086 |
303
+ | 3.7603 | 252.0 | 9576 | 4.1002 | 0.4074 |
304
+ | 3.7645 | 253.0 | 9614 | 4.0957 | 0.4086 |
305
+ | 3.7622 | 254.0 | 9652 | 4.0959 | 0.4084 |
306
+ | 3.7583 | 255.0 | 9690 | 4.0955 | 0.4084 |
307
+ | 3.752 | 256.0 | 9728 | 4.0930 | 0.4086 |
308
+ | 3.7545 | 257.0 | 9766 | 4.0912 | 0.4091 |
309
+ | 3.7447 | 258.0 | 9804 | 4.0923 | 0.4091 |
310
+ | 3.7483 | 259.0 | 9842 | 4.0894 | 0.4086 |
311
+ | 3.7428 | 260.0 | 9880 | 4.0910 | 0.4086 |
312
+ | 3.7407 | 261.0 | 9918 | 4.0877 | 0.4086 |
313
+ | 3.7405 | 262.0 | 9956 | 4.0891 | 0.4091 |
314
+ | 3.7354 | 263.0 | 9994 | 4.0870 | 0.4088 |
315
+ | 3.7353 | 264.0 | 10032 | 4.0856 | 0.4086 |
316
+ | 3.7312 | 265.0 | 10070 | 4.0838 | 0.4091 |
317
+ | 3.7313 | 266.0 | 10108 | 4.0829 | 0.4091 |
318
+ | 3.7264 | 267.0 | 10146 | 4.0827 | 0.4091 |
319
+ | 3.7221 | 268.0 | 10184 | 4.0815 | 0.4093 |
320
+ | 3.7211 | 269.0 | 10222 | 4.0801 | 0.4091 |
321
+ | 3.7232 | 270.0 | 10260 | 4.0787 | 0.4093 |
322
+ | 3.718 | 271.0 | 10298 | 4.0780 | 0.4101 |
323
+ | 3.7208 | 272.0 | 10336 | 4.0771 | 0.4108 |
324
+ | 3.7109 | 273.0 | 10374 | 4.0766 | 0.4115 |
325
+ | 3.7146 | 274.0 | 10412 | 4.0739 | 0.4110 |
326
+ | 3.7071 | 275.0 | 10450 | 4.0737 | 0.4118 |
327
+ | 3.7044 | 276.0 | 10488 | 4.0742 | 0.4123 |
328
+ | 3.7094 | 277.0 | 10526 | 4.0719 | 0.4125 |
329
+ | 3.7028 | 278.0 | 10564 | 4.0718 | 0.4120 |
330
+ | 3.7051 | 279.0 | 10602 | 4.0699 | 0.4120 |
331
+ | 3.7011 | 280.0 | 10640 | 4.0681 | 0.4125 |
332
+ | 3.6954 | 281.0 | 10678 | 4.0668 | 0.4120 |
333
+ | 3.6933 | 282.0 | 10716 | 4.0669 | 0.4123 |
334
+ | 3.6935 | 283.0 | 10754 | 4.0638 | 0.4125 |
335
+ | 3.6867 | 284.0 | 10792 | 4.0650 | 0.4125 |
336
+ | 3.6888 | 285.0 | 10830 | 4.0641 | 0.4120 |
337
+ | 3.6843 | 286.0 | 10868 | 4.0638 | 0.4115 |
338
+ | 3.6824 | 287.0 | 10906 | 4.0621 | 0.4125 |
339
+ | 3.6821 | 288.0 | 10944 | 4.0603 | 0.4123 |
340
+ | 3.6802 | 289.0 | 10982 | 4.0622 | 0.4125 |
341
+ | 3.6789 | 290.0 | 11020 | 4.0579 | 0.4128 |
342
+ | 3.6767 | 291.0 | 11058 | 4.0579 | 0.4130 |
343
+ | 3.6751 | 292.0 | 11096 | 4.0582 | 0.4137 |
344
+ | 3.6726 | 293.0 | 11134 | 4.0556 | 0.4137 |
345
+ | 3.6704 | 294.0 | 11172 | 4.0583 | 0.4137 |
346
+ | 3.6703 | 295.0 | 11210 | 4.0556 | 0.4142 |
347
+ | 3.6662 | 296.0 | 11248 | 4.0518 | 0.4147 |
348
+ | 3.6643 | 297.0 | 11286 | 4.0521 | 0.4147 |
349
+ | 3.6623 | 298.0 | 11324 | 4.0544 | 0.4145 |
350
+ | 3.6626 | 299.0 | 11362 | 4.0518 | 0.4147 |
351
+ | 3.661 | 300.0 | 11400 | 4.0496 | 0.4147 |
352
+ | 3.6553 | 301.0 | 11438 | 4.0482 | 0.4150 |
353
+ | 3.6573 | 302.0 | 11476 | 4.0472 | 0.4147 |
354
+ | 3.6548 | 303.0 | 11514 | 4.0460 | 0.4152 |
355
+ | 3.6531 | 304.0 | 11552 | 4.0470 | 0.4147 |
356
+ | 3.6549 | 305.0 | 11590 | 4.0461 | 0.4150 |
357
+ | 3.6485 | 306.0 | 11628 | 4.0461 | 0.4147 |
358
+ | 3.6441 | 307.0 | 11666 | 4.0465 | 0.4150 |
359
+ | 3.6438 | 308.0 | 11704 | 4.0425 | 0.4159 |
360
+ | 3.6435 | 309.0 | 11742 | 4.0410 | 0.4157 |
361
+ | 3.6397 | 310.0 | 11780 | 4.0407 | 0.4159 |
362
+ | 3.6363 | 311.0 | 11818 | 4.0424 | 0.4154 |
363
+ | 3.6315 | 312.0 | 11856 | 4.0436 | 0.4154 |
364
+ | 3.6323 | 313.0 | 11894 | 4.0409 | 0.4157 |
365
+ | 3.6386 | 314.0 | 11932 | 4.0386 | 0.4157 |
366
+ | 3.6303 | 315.0 | 11970 | 4.0389 | 0.4154 |
367
+ | 3.6336 | 316.0 | 12008 | 4.0394 | 0.4164 |
368
+ | 3.6281 | 317.0 | 12046 | 4.0389 | 0.4167 |
369
+ | 3.6249 | 318.0 | 12084 | 4.0379 | 0.4176 |
370
+ | 3.6277 | 319.0 | 12122 | 4.0371 | 0.4176 |
371
+ | 3.6232 | 320.0 | 12160 | 4.0353 | 0.4172 |
372
+ | 3.6177 | 321.0 | 12198 | 4.0363 | 0.4176 |
373
+ | 3.626 | 322.0 | 12236 | 4.0319 | 0.4174 |
374
+ | 3.6181 | 323.0 | 12274 | 4.0319 | 0.4172 |
375
+ | 3.6183 | 324.0 | 12312 | 4.0329 | 0.4176 |
376
+ | 3.6169 | 325.0 | 12350 | 4.0328 | 0.4176 |
377
+ | 3.6094 | 326.0 | 12388 | 4.0318 | 0.4179 |
378
+ | 3.6138 | 327.0 | 12426 | 4.0294 | 0.4179 |
379
+ | 3.6101 | 328.0 | 12464 | 4.0311 | 0.4181 |
380
+ | 3.6062 | 329.0 | 12502 | 4.0299 | 0.4184 |
381
+ | 3.6093 | 330.0 | 12540 | 4.0276 | 0.4181 |
382
+ | 3.6071 | 331.0 | 12578 | 4.0301 | 0.4181 |
383
+ | 3.6064 | 332.0 | 12616 | 4.0277 | 0.4184 |
384
+ | 3.5982 | 333.0 | 12654 | 4.0288 | 0.4184 |
385
+ | 3.6064 | 334.0 | 12692 | 4.0256 | 0.4179 |
386
+ | 3.6023 | 335.0 | 12730 | 4.0252 | 0.4184 |
387
+ | 3.5992 | 336.0 | 12768 | 4.0240 | 0.4186 |
388
+ | 3.5997 | 337.0 | 12806 | 4.0237 | 0.4189 |
389
+ | 3.5955 | 338.0 | 12844 | 4.0235 | 0.4186 |
390
+ | 3.5929 | 339.0 | 12882 | 4.0233 | 0.4186 |
391
+ | 3.5953 | 340.0 | 12920 | 4.0210 | 0.4189 |
392
+ | 3.5915 | 341.0 | 12958 | 4.0210 | 0.4184 |
393
+ | 3.5835 | 342.0 | 12996 | 4.0226 | 0.4189 |
394
+ | 3.5852 | 343.0 | 13034 | 4.0227 | 0.4189 |
395
+ | 3.5894 | 344.0 | 13072 | 4.0222 | 0.4191 |
396
+ | 3.5864 | 345.0 | 13110 | 4.0227 | 0.4194 |
397
+ | 3.5854 | 346.0 | 13148 | 4.0190 | 0.4194 |
398
+ | 3.5841 | 347.0 | 13186 | 4.0180 | 0.4191 |
399
+ | 3.5821 | 348.0 | 13224 | 4.0189 | 0.4194 |
400
+ | 3.5823 | 349.0 | 13262 | 4.0176 | 0.4191 |
401
+ | 3.5772 | 350.0 | 13300 | 4.0164 | 0.4191 |
402
+ | 3.5827 | 351.0 | 13338 | 4.0147 | 0.4186 |
403
+ | 3.5747 | 352.0 | 13376 | 4.0148 | 0.4194 |
404
+ | 3.5745 | 353.0 | 13414 | 4.0169 | 0.4194 |
405
+ | 3.576 | 354.0 | 13452 | 4.0162 | 0.4194 |
406
+ | 3.5723 | 355.0 | 13490 | 4.0123 | 0.4194 |
407
+ | 3.5669 | 356.0 | 13528 | 4.0144 | 0.4196 |
408
+ | 3.5721 | 357.0 | 13566 | 4.0136 | 0.4189 |
409
+ | 3.5725 | 358.0 | 13604 | 4.0124 | 0.4194 |
410
+ | 3.5627 | 359.0 | 13642 | 4.0129 | 0.4196 |
411
+ | 3.5632 | 360.0 | 13680 | 4.0127 | 0.4194 |
412
+ | 3.5641 | 361.0 | 13718 | 4.0104 | 0.4196 |
413
+ | 3.5636 | 362.0 | 13756 | 4.0100 | 0.4194 |
414
+ | 3.5566 | 363.0 | 13794 | 4.0127 | 0.4194 |
415
+ | 3.5556 | 364.0 | 13832 | 4.0131 | 0.4198 |
416
+ | 3.5606 | 365.0 | 13870 | 4.0108 | 0.4194 |
417
+ | 3.5573 | 366.0 | 13908 | 4.0095 | 0.4196 |
418
+ | 3.5603 | 367.0 | 13946 | 4.0079 | 0.4191 |
419
+ | 3.5552 | 368.0 | 13984 | 4.0073 | 0.4191 |
420
+ | 3.5594 | 369.0 | 14022 | 4.0080 | 0.4194 |
421
+ | 3.5557 | 370.0 | 14060 | 4.0067 | 0.4194 |
422
+ | 3.5523 | 371.0 | 14098 | 4.0065 | 0.4196 |
423
+ | 3.5516 | 372.0 | 14136 | 4.0070 | 0.4194 |
424
+ | 3.5466 | 373.0 | 14174 | 4.0073 | 0.4196 |
425
+ | 3.5474 | 374.0 | 14212 | 4.0040 | 0.4194 |
426
+ | 3.5481 | 375.0 | 14250 | 4.0032 | 0.4196 |
427
+ | 3.5496 | 376.0 | 14288 | 4.0051 | 0.4194 |
428
+ | 3.5489 | 377.0 | 14326 | 4.0035 | 0.4194 |
429
+ | 3.5439 | 378.0 | 14364 | 4.0032 | 0.4198 |
430
+ | 3.5464 | 379.0 | 14402 | 4.0029 | 0.4206 |
431
+ | 3.5455 | 380.0 | 14440 | 4.0037 | 0.4198 |
432
+ | 3.5439 | 381.0 | 14478 | 4.0024 | 0.4206 |
433
+ | 3.542 | 382.0 | 14516 | 4.0011 | 0.4203 |
434
+ | 3.5366 | 383.0 | 14554 | 4.0011 | 0.4203 |
435
+ | 3.5368 | 384.0 | 14592 | 4.0015 | 0.4206 |
436
+ | 3.5382 | 385.0 | 14630 | 4.0018 | 0.4211 |
437
+ | 3.5358 | 386.0 | 14668 | 4.0002 | 0.4201 |
438
+ | 3.5324 | 387.0 | 14706 | 3.9990 | 0.4198 |
439
+ | 3.5378 | 388.0 | 14744 | 4.0002 | 0.4206 |
440
+ | 3.5334 | 389.0 | 14782 | 3.9985 | 0.4208 |
441
+ | 3.5349 | 390.0 | 14820 | 3.9987 | 0.4211 |
442
+ | 3.5378 | 391.0 | 14858 | 3.9984 | 0.4211 |
443
+ | 3.5304 | 392.0 | 14896 | 3.9977 | 0.4206 |
444
+ | 3.5241 | 393.0 | 14934 | 3.9985 | 0.4213 |
445
+ | 3.527 | 394.0 | 14972 | 3.9997 | 0.4211 |
446
+ | 3.5261 | 395.0 | 15010 | 3.9985 | 0.4211 |
447
+ | 3.5233 | 396.0 | 15048 | 3.9983 | 0.4216 |
448
+ | 3.5279 | 397.0 | 15086 | 3.9966 | 0.4213 |
449
+ | 3.5276 | 398.0 | 15124 | 3.9958 | 0.4213 |
450
+ | 3.5214 | 399.0 | 15162 | 3.9957 | 0.4213 |
451
+ | 3.5222 | 400.0 | 15200 | 3.9958 | 0.4211 |
452
+ | 3.5163 | 401.0 | 15238 | 3.9957 | 0.4213 |
453
+ | 3.5208 | 402.0 | 15276 | 3.9953 | 0.4218 |
454
+ | 3.5168 | 403.0 | 15314 | 3.9949 | 0.4218 |
455
+ | 3.5242 | 404.0 | 15352 | 3.9941 | 0.4216 |
456
+ | 3.5205 | 405.0 | 15390 | 3.9937 | 0.4213 |
457
+ | 3.5158 | 406.0 | 15428 | 3.9949 | 0.4218 |
458
+ | 3.517 | 407.0 | 15466 | 3.9939 | 0.4213 |
459
+ | 3.519 | 408.0 | 15504 | 3.9944 | 0.4216 |
460
+ | 3.5164 | 409.0 | 15542 | 3.9929 | 0.4213 |
461
+ | 3.5133 | 410.0 | 15580 | 3.9925 | 0.4211 |
462
+ | 3.5199 | 411.0 | 15618 | 3.9906 | 0.4211 |
463
+ | 3.5117 | 412.0 | 15656 | 3.9920 | 0.4216 |
464
+ | 3.5151 | 413.0 | 15694 | 3.9906 | 0.4218 |
465
+ | 3.5093 | 414.0 | 15732 | 3.9914 | 0.4218 |
466
+ | 3.512 | 415.0 | 15770 | 3.9909 | 0.4216 |
467
+ | 3.5076 | 416.0 | 15808 | 3.9912 | 0.4218 |
468
+ | 3.5059 | 417.0 | 15846 | 3.9916 | 0.4220 |
469
+ | 3.5096 | 418.0 | 15884 | 3.9907 | 0.4213 |
470
+ | 3.5038 | 419.0 | 15922 | 3.9902 | 0.4213 |
471
+ | 3.5089 | 420.0 | 15960 | 3.9895 | 0.4216 |
472
+ | 3.5091 | 421.0 | 15998 | 3.9893 | 0.4213 |
473
+ | 3.5101 | 422.0 | 16036 | 3.9890 | 0.4218 |
474
+ | 3.5061 | 423.0 | 16074 | 3.9900 | 0.4220 |
475
+ | 3.5048 | 424.0 | 16112 | 3.9888 | 0.4218 |
476
+ | 3.501 | 425.0 | 16150 | 3.9881 | 0.4218 |
477
+ | 3.5067 | 426.0 | 16188 | 3.9877 | 0.4218 |
478
+ | 3.5037 | 427.0 | 16226 | 3.9866 | 0.4223 |
479
+ | 3.5052 | 428.0 | 16264 | 3.9855 | 0.4223 |
480
+ | 3.5049 | 429.0 | 16302 | 3.9862 | 0.4223 |
481
+ | 3.5017 | 430.0 | 16340 | 3.9873 | 0.4228 |
482
+ | 3.5038 | 431.0 | 16378 | 3.9872 | 0.4228 |
483
+ | 3.5072 | 432.0 | 16416 | 3.9853 | 0.4225 |
484
+ | 3.5009 | 433.0 | 16454 | 3.9849 | 0.4225 |
485
+ | 3.5023 | 434.0 | 16492 | 3.9856 | 0.4228 |
486
+ | 3.4982 | 435.0 | 16530 | 3.9860 | 0.4228 |
487
+ | 3.4927 | 436.0 | 16568 | 3.9859 | 0.4230 |
488
+ | 3.4959 | 437.0 | 16606 | 3.9861 | 0.4230 |
489
+ | 3.4984 | 438.0 | 16644 | 3.9860 | 0.4228 |
490
+ | 3.5005 | 439.0 | 16682 | 3.9847 | 0.4230 |
491
+ | 3.4947 | 440.0 | 16720 | 3.9845 | 0.4230 |
492
+ | 3.4964 | 441.0 | 16758 | 3.9843 | 0.4230 |
493
+ | 3.4955 | 442.0 | 16796 | 3.9844 | 0.4233 |
494
+ | 3.4923 | 443.0 | 16834 | 3.9843 | 0.4233 |
495
+ | 3.4993 | 444.0 | 16872 | 3.9842 | 0.4230 |
496
+ | 3.4889 | 445.0 | 16910 | 3.9846 | 0.4233 |
497
+ | 3.487 | 446.0 | 16948 | 3.9855 | 0.4233 |
498
+ | 3.4965 | 447.0 | 16986 | 3.9851 | 0.4233 |
499
+ | 3.4873 | 448.0 | 17024 | 3.9852 | 0.4233 |
500
+ | 3.4936 | 449.0 | 17062 | 3.9847 | 0.4233 |
501
+ | 3.494 | 450.0 | 17100 | 3.9841 | 0.4233 |
502
+ | 3.4855 | 451.0 | 17138 | 3.9836 | 0.4233 |
503
+ | 3.4898 | 452.0 | 17176 | 3.9830 | 0.4230 |
504
+ | 3.4866 | 453.0 | 17214 | 3.9831 | 0.4233 |
505
+ | 3.4866 | 454.0 | 17252 | 3.9831 | 0.4235 |
506
+ | 3.4886 | 455.0 | 17290 | 3.9836 | 0.4238 |
507
+ | 3.4874 | 456.0 | 17328 | 3.9838 | 0.4238 |
508
+ | 3.486 | 457.0 | 17366 | 3.9838 | 0.4238 |
509
+ | 3.4869 | 458.0 | 17404 | 3.9835 | 0.4235 |
510
+ | 3.4845 | 459.0 | 17442 | 3.9833 | 0.4238 |
511
+ | 3.4849 | 460.0 | 17480 | 3.9825 | 0.4238 |
512
+ | 3.4841 | 461.0 | 17518 | 3.9818 | 0.4235 |
513
+ | 3.4924 | 462.0 | 17556 | 3.9814 | 0.4235 |
514
+ | 3.571 | 463.0 | 17594 | 3.9815 | 0.4235 |
515
+ | 3.4811 | 464.0 | 17632 | 3.9813 | 0.4235 |
516
+ | 3.4851 | 465.0 | 17670 | 3.9810 | 0.4235 |
517
+ | 3.4776 | 466.0 | 17708 | 3.9813 | 0.4238 |
518
+ | 3.4849 | 467.0 | 17746 | 3.9810 | 0.4235 |
519
+ | 3.4766 | 468.0 | 17784 | 3.9813 | 0.4238 |
520
+ | 3.4791 | 469.0 | 17822 | 3.9815 | 0.4238 |
521
+ | 3.4814 | 470.0 | 17860 | 3.9813 | 0.4238 |
522
+ | 3.4861 | 471.0 | 17898 | 3.9809 | 0.4238 |
523
+ | 3.4861 | 472.0 | 17936 | 3.9806 | 0.4235 |
524
+ | 3.4825 | 473.0 | 17974 | 3.9809 | 0.4235 |
525
+ | 3.4758 | 474.0 | 18012 | 3.9811 | 0.4235 |
526
+ | 3.4811 | 475.0 | 18050 | 3.9807 | 0.4235 |
527
+ | 3.4831 | 476.0 | 18088 | 3.9808 | 0.4238 |
528
+ | 3.4837 | 477.0 | 18126 | 3.9803 | 0.4238 |
529
+ | 3.4843 | 478.0 | 18164 | 3.9803 | 0.4240 |
530
+ | 3.4825 | 479.0 | 18202 | 3.9802 | 0.4240 |
531
+ | 3.4807 | 480.0 | 18240 | 3.9800 | 0.4240 |
532
+ | 3.4808 | 481.0 | 18278 | 3.9797 | 0.4240 |
533
+ | 3.4805 | 482.0 | 18316 | 3.9797 | 0.4240 |
534
+ | 3.4818 | 483.0 | 18354 | 3.9796 | 0.4240 |
535
+ | 3.4821 | 484.0 | 18392 | 3.9794 | 0.4240 |
536
+ | 3.4802 | 485.0 | 18430 | 3.9794 | 0.4240 |
537
+ | 3.4805 | 486.0 | 18468 | 3.9796 | 0.4240 |
538
+ | 3.4831 | 487.0 | 18506 | 3.9796 | 0.4240 |
539
+ | 3.4846 | 488.0 | 18544 | 3.9798 | 0.4240 |
540
+ | 3.4824 | 489.0 | 18582 | 3.9798 | 0.4240 |
541
+ | 3.4807 | 490.0 | 18620 | 3.9799 | 0.4240 |
542
+ | 3.4809 | 491.0 | 18658 | 3.9799 | 0.4240 |
543
+ | 3.4801 | 492.0 | 18696 | 3.9799 | 0.4238 |
544
+ | 3.479 | 493.0 | 18734 | 3.9799 | 0.4238 |
545
+ | 3.48 | 494.0 | 18772 | 3.9799 | 0.4238 |
546
+ | 3.4828 | 495.0 | 18810 | 3.9799 | 0.4238 |
547
+ | 3.4812 | 496.0 | 18848 | 3.9799 | 0.4238 |
548
+ | 3.4798 | 497.0 | 18886 | 3.9799 | 0.4238 |
549
+ | 3.4866 | 498.0 | 18924 | 3.9799 | 0.4238 |
550
+ | 3.4785 | 499.0 | 18962 | 3.9799 | 0.4238 |
551
+ | 3.4893 | 500.0 | 19000 | 3.9799 | 0.4238 |
552
+
553
+
554
+ ### Framework versions
555
+
556
+ - Transformers 4.38.0.dev0
557
+ - Pytorch 2.2.0
558
+ - Datasets 2.16.1
559
+ - Tokenizers 0.15.1
all_results.json ADDED
@@ -0,0 +1,15 @@
1
+ {
2
+ "epoch": 500.0,
3
+ "eval_accuracy": 0.4239980449657869,
4
+ "eval_loss": 3.9793689250946045,
5
+ "eval_runtime": 0.6297,
6
+ "eval_samples": 4,
7
+ "eval_samples_per_second": 6.352,
8
+ "eval_steps_per_second": 1.588,
9
+ "perplexity": 53.4832716246652,
10
+ "train_loss": 3.948820646587171,
11
+ "train_runtime": 16522.6397,
12
+ "train_samples": 38,
13
+ "train_samples_per_second": 1.15,
14
+ "train_steps_per_second": 1.15
15
+ }
config.json ADDED
@@ -0,0 +1,39 @@
1
+ {
2
+ "_name_or_path": "nferruz/ProtGPT2",
3
+ "activation_function": "gelu_new",
4
+ "architectures": [
5
+ "GPT2LMHeadModel"
6
+ ],
7
+ "attn_pdrop": 0.1,
8
+ "bos_token_id": 0,
9
+ "embd_pdrop": 0.1,
10
+ "eos_token_id": 0,
11
+ "initializer_range": 0.02,
12
+ "layer_norm_epsilon": 1e-05,
13
+ "model_type": "gpt2",
14
+ "n_ctx": 1024,
15
+ "n_embd": 1280,
16
+ "n_head": 20,
17
+ "n_inner": null,
18
+ "n_layer": 36,
19
+ "n_positions": 1024,
20
+ "reorder_and_upcast_attn": false,
21
+ "resid_pdrop": 0.1,
22
+ "scale_attn_by_inverse_layer_idx": false,
23
+ "scale_attn_weights": true,
24
+ "summary_activation": null,
25
+ "summary_first_dropout": 0.1,
26
+ "summary_proj_to_labels": true,
27
+ "summary_type": "cls_index",
28
+ "summary_use_proj": true,
29
+ "task_specific_params": {
30
+ "text-generation": {
31
+ "do_sample": true,
32
+ "max_length": 50
33
+ }
34
+ },
35
+ "torch_dtype": "float32",
36
+ "transformers_version": "4.38.0.dev0",
37
+ "use_cache": true,
38
+ "vocab_size": 50257
39
+ }
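The shape fields above pin down the checkpoint size. A rough GPT-2 parameter count (ignoring biases and LayerNorms) is `12·n_layer·n_embd² + (vocab_size + n_positions)·n_embd`, which at 4 bytes per float32 parameter lands within 0.1% of the 3.10 GB `model.safetensors` file:

```python
# Values from config.json
n_layer, n_embd, vocab_size, n_positions = 36, 1280, 50257, 1024

# ~12*d^2 per block: 4*d^2 (attention) + 8*d^2 (MLP); embeddings added separately.
params = 12 * n_layer * n_embd**2 + (vocab_size + n_positions) * n_embd
print(params)            # ≈ 7.73e8 parameters
print(params * 4 / 1e9)  # ≈ 3.09 GB at float32
```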
eval_results.json ADDED
@@ -0,0 +1,10 @@
1
+ {
2
+ "epoch": 500.0,
3
+ "eval_accuracy": 0.4239980449657869,
4
+ "eval_loss": 3.9793689250946045,
5
+ "eval_runtime": 0.6297,
6
+ "eval_samples": 4,
7
+ "eval_samples_per_second": 6.352,
8
+ "eval_steps_per_second": 1.588,
9
+ "perplexity": 53.4832716246652
10
+ }
events.out.tfevents.1711986222.lambda-a6000.2759502.0 ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:c2d6f8449884175d929c48c244fe44d7039e0cf007bdf044d8f4b2aba953bb63
3
+ size 245592
generation_config.json ADDED
@@ -0,0 +1,6 @@
1
+ {
2
+ "_from_model_config": true,
3
+ "bos_token_id": 0,
4
+ "eos_token_id": 0,
5
+ "transformers_version": "4.38.0.dev0"
6
+ }
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:9bcbb408921d4631a3b79fcae6a1bf73ff6a72259ffaa1ebd8437b99c381fb21
3
+ size 3096165928
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
1
+ {
2
+ "bos_token": {
3
+ "content": "<|endoftext|>",
4
+ "lstrip": false,
5
+ "normalized": true,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "eos_token": {
10
+ "content": "<|endoftext|>",
11
+ "lstrip": false,
12
+ "normalized": true,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "unk_token": {
17
+ "content": "<|endoftext|>",
18
+ "lstrip": false,
19
+ "normalized": true,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ }
23
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,19 @@
1
+ {
2
+ "add_prefix_space": false,
3
+ "added_tokens_decoder": {
4
+ "0": {
5
+ "content": "<|endoftext|>",
6
+ "lstrip": false,
7
+ "normalized": true,
8
+ "rstrip": false,
9
+ "single_word": false,
10
+ "special": true
11
+ }
12
+ },
13
+ "bos_token": "<|endoftext|>",
14
+ "clean_up_tokenization_spaces": true,
15
+ "eos_token": "<|endoftext|>",
16
+ "model_max_length": 1000000000000000019884624838656,
17
+ "tokenizer_class": "GPT2Tokenizer",
18
+ "unk_token": "<|endoftext|>"
19
+ }
train_results.json ADDED
@@ -0,0 +1,8 @@
1
+ {
2
+ "epoch": 500.0,
3
+ "train_loss": 3.948820646587171,
4
+ "train_runtime": 16522.6397,
5
+ "train_samples": 38,
6
+ "train_samples_per_second": 1.15,
7
+ "train_steps_per_second": 1.15
8
+ }
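These numbers are mutually consistent: 38 samples at batch size 1 give 38 optimizer steps per epoch, so 500 epochs is 19 000 steps, the final `Step` in the training table, and 19 000 steps over the 16 522.6 s runtime reproduces the reported ~1.15 steps per second:

```python
train_samples, batch_size, epochs = 38, 1, 500
steps_per_epoch = train_samples // batch_size   # 38
total_steps = steps_per_epoch * epochs
print(total_steps)                              # 19000
print(round(total_steps / 16522.6397, 2))       # ≈ 1.15 steps/s
```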
trainer_state.json ADDED
The diff for this file is too large to render. See raw diff
 
training_args.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:77118ead5d36b3dd8d023df36ff7df95689490320c2b4b7a7a3f8ca6943d5029
3
+ size 4728
vocab.json ADDED
The diff for this file is too large to render. See raw diff