Joy28 committed

Commit 5835a5a
1 parent: f363ca3

Model save

Files changed (2):
  1. README.md +160 -0
  2. model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,160 @@
---
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: videomae-base-finetuned-subset-100epochs
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# videomae-base-finetuned-subset-100epochs

This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7077
- Accuracy: 0.7685

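For context, the sketch below shows how a VideoMAE classification checkpoint like this one could be loaded for inference with 🤗 Transformers. The repo id `Joy28/videomae-base-finetuned-subset-100epochs`, the 16-frame clip length, and the dummy input are assumptions for illustration and are not stated in this card.

```python
# Illustrative sketch only; the repo id and clip shape are assumed, not confirmed by this card.
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

model_id = "Joy28/videomae-base-finetuned-subset-100epochs"  # assumed repo id

processor = VideoMAEImageProcessor.from_pretrained(model_id)
model = VideoMAEForVideoClassification.from_pretrained(model_id)
model.eval()

# VideoMAE typically expects a clip of 16 frames; replace these random frames
# with real video frames (e.g. decoded with decord or PyAV).
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```
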
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a rough `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 5550

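The original training script is not part of this repository; the following is only a rough reconstruction of how the values above could be expressed with `transformers.TrainingArguments`. The output directory, evaluation/save strategies, and logging interval are assumptions.

```python
# Rough sketch for illustration; not the script actually used to train this model.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="videomae-base-finetuned-subset-100epochs",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=5550,               # "training_steps" in the list above
    evaluation_strategy="epoch",  # assumed; the results table logs one evaluation per epoch
    save_strategy="epoch",        # assumed
    logging_steps=10,             # assumed
)
```
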
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.6657 | 0.01 | 56 | 1.6248 | 0.2258 |
| 1.6109 | 1.01 | 112 | 1.5601 | 0.3917 |
| 1.5669 | 2.01 | 168 | 1.5563 | 0.3733 |
| 1.45 | 3.01 | 224 | 1.0988 | 0.5991 |
| 1.1208 | 4.01 | 280 | 1.2279 | 0.5714 |
| 1.1588 | 5.01 | 336 | 0.8424 | 0.7097 |
| 1.0834 | 6.01 | 392 | 1.1035 | 0.5346 |
| 1.2194 | 7.01 | 448 | 1.0749 | 0.4839 |
| 0.8462 | 8.01 | 504 | 0.8755 | 0.6406 |
| 1.058 | 9.01 | 560 | 0.9025 | 0.6498 |
| 1.0163 | 10.01 | 616 | 1.2588 | 0.4839 |
| 1.0639 | 11.01 | 672 | 0.8928 | 0.6359 |
| 0.9317 | 12.01 | 728 | 0.8825 | 0.6221 |
| 0.9038 | 13.01 | 784 | 0.8765 | 0.5622 |
| 0.9155 | 14.01 | 840 | 0.8431 | 0.7005 |
| 1.0731 | 15.01 | 896 | 0.8175 | 0.7005 |
| 0.6864 | 16.01 | 952 | 1.0591 | 0.5853 |
| 0.9537 | 17.01 | 1008 | 0.9703 | 0.6221 |
| 0.7499 | 18.01 | 1064 | 0.8371 | 0.5806 |
| 0.7142 | 19.01 | 1120 | 0.9132 | 0.6636 |
| 0.675 | 20.01 | 1176 | 0.7597 | 0.6728 |
| 0.604 | 21.01 | 1232 | 1.2004 | 0.5714 |
| 0.7738 | 22.01 | 1288 | 1.0633 | 0.5668 |
| 0.7651 | 23.01 | 1344 | 0.6865 | 0.6820 |
| 0.6292 | 24.01 | 1400 | 0.7607 | 0.6912 |
| 0.7387 | 25.01 | 1456 | 1.3038 | 0.5346 |
| 0.7038 | 26.01 | 1512 | 1.2832 | 0.5530 |
| 0.7565 | 27.01 | 1568 | 0.8128 | 0.7005 |
| 0.6516 | 28.01 | 1624 | 1.0893 | 0.5392 |
| 0.7074 | 29.01 | 1680 | 1.0894 | 0.5991 |
| 0.4902 | 30.01 | 1736 | 1.0695 | 0.5622 |
| 0.4563 | 31.01 | 1792 | 1.2922 | 0.5300 |
| 0.7543 | 32.01 | 1848 | 0.8960 | 0.6820 |
| 0.7467 | 33.01 | 1904 | 0.7861 | 0.7465 |
| 0.6459 | 34.01 | 1960 | 1.2835 | 0.5622 |
| 0.7296 | 35.01 | 2016 | 1.0303 | 0.5806 |
| 0.5 | 36.01 | 2072 | 0.8924 | 0.6129 |
| 0.5181 | 37.01 | 2128 | 0.8769 | 0.7235 |
| 0.5225 | 38.01 | 2184 | 0.7288 | 0.7512 |
| 0.5617 | 39.01 | 2240 | 0.6330 | 0.7926 |
| 0.677 | 40.01 | 2296 | 0.7733 | 0.7419 |
| 0.6891 | 41.01 | 2352 | 0.7463 | 0.8157 |
| 0.6662 | 42.01 | 2408 | 0.9304 | 0.7235 |
| 0.4602 | 43.01 | 2464 | 1.5115 | 0.5207 |
| 0.581 | 44.01 | 2520 | 1.2296 | 0.6175 |
| 0.5418 | 45.01 | 2576 | 1.0070 | 0.6221 |
| 0.5199 | 46.01 | 2632 | 1.1344 | 0.6083 |
| 0.6876 | 47.01 | 2688 | 0.9800 | 0.5760 |
| 0.5165 | 48.01 | 2744 | 1.3709 | 0.5069 |
| 0.5727 | 49.01 | 2800 | 0.9960 | 0.6866 |
| 0.3698 | 50.01 | 2856 | 1.2246 | 0.5484 |
| 0.5836 | 51.01 | 2912 | 0.9892 | 0.6866 |
| 0.6017 | 52.01 | 2968 | 0.9388 | 0.6590 |
| 0.4851 | 53.01 | 3024 | 1.1415 | 0.6590 |
| 0.3038 | 54.01 | 3080 | 0.9413 | 0.6959 |
| 0.6075 | 55.01 | 3136 | 1.0467 | 0.6129 |
| 0.4474 | 56.01 | 3192 | 0.8436 | 0.6866 |
| 0.3711 | 57.01 | 3248 | 0.8994 | 0.6774 |
| 0.5279 | 58.01 | 3304 | 0.8859 | 0.7189 |
| 0.6032 | 59.01 | 3360 | 1.2931 | 0.6498 |
| 0.3282 | 60.01 | 3416 | 0.9435 | 0.7143 |
| 0.3506 | 61.01 | 3472 | 1.0971 | 0.6728 |
| 0.3169 | 62.01 | 3528 | 0.9101 | 0.7512 |
| 0.438 | 63.01 | 3584 | 1.4072 | 0.6359 |
| 0.5208 | 64.01 | 3640 | 1.2648 | 0.6544 |
| 0.4563 | 65.01 | 3696 | 1.1162 | 0.6498 |
| 0.6693 | 66.01 | 3752 | 1.8558 | 0.5576 |
| 0.5599 | 67.01 | 3808 | 1.6574 | 0.5392 |
| 0.4751 | 68.01 | 3864 | 1.1883 | 0.6129 |
| 0.6489 | 69.01 | 3920 | 1.2733 | 0.6129 |
| 0.4229 | 70.01 | 3976 | 1.0994 | 0.6682 |
| 0.4194 | 71.01 | 4032 | 1.1464 | 0.6175 |
| 0.2121 | 72.01 | 4088 | 1.1798 | 0.6175 |
| 0.4106 | 73.01 | 4144 | 1.3294 | 0.5806 |
| 0.3962 | 74.01 | 4200 | 1.4209 | 0.6359 |
| 0.2963 | 75.01 | 4256 | 1.5016 | 0.5945 |
| 0.5436 | 76.01 | 4312 | 1.5647 | 0.5484 |
| 0.4115 | 77.01 | 4368 | 1.4309 | 0.6037 |
| 0.1635 | 78.01 | 4424 | 1.3660 | 0.6452 |
| 0.2931 | 79.01 | 4480 | 1.3299 | 0.6498 |
| 0.5154 | 80.01 | 4536 | 1.6550 | 0.5806 |
| 0.2993 | 81.01 | 4592 | 1.6520 | 0.5991 |
| 0.4391 | 82.01 | 4648 | 1.3823 | 0.6406 |
| 0.485 | 83.01 | 4704 | 1.4860 | 0.6037 |
| 0.3313 | 84.01 | 4760 | 1.3875 | 0.6175 |
| 0.4194 | 85.01 | 4816 | 1.4334 | 0.5899 |
| 0.4515 | 86.01 | 4872 | 1.6489 | 0.5991 |
| 0.3283 | 87.01 | 4928 | 1.4549 | 0.6083 |
| 0.1914 | 88.01 | 4984 | 1.3415 | 0.6267 |
| 0.2142 | 89.01 | 5040 | 1.6426 | 0.6267 |
| 0.3121 | 90.01 | 5096 | 1.6999 | 0.6037 |
| 0.367 | 91.01 | 5152 | 1.4683 | 0.6083 |
| 0.178 | 92.01 | 5208 | 1.4665 | 0.6267 |
| 0.3972 | 93.01 | 5264 | 1.3464 | 0.6452 |
| 0.224 | 94.01 | 5320 | 1.5009 | 0.6175 |
| 0.1848 | 95.01 | 5376 | 1.5068 | 0.6129 |
| 0.2776 | 96.01 | 5432 | 1.5383 | 0.6175 |
| 0.3506 | 97.01 | 5488 | 1.5356 | 0.6129 |
| 0.401 | 98.01 | 5544 | 1.5504 | 0.6175 |
| 0.3466 | 99.0 | 5550 | 1.5505 | 0.6175 |


### Framework versions

- Transformers 4.36.2
- Pytorch 1.13.1
- Datasets 2.16.1
- Tokenizers 0.15.0
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:78d1d5bfa7ec60f7809165d99037c795fd287aa3aaf6ec30a56deebe07266bcc
+oid sha256:bd94865e65dcbb10fe3bf8531d517fe5b5db8a4b40aff15862d963011ced4fc0
 size 344946604