videomae-base-finetuned-subset-200epochs

This model is a fine-tuned version of MCG-NJU/videomae-base on an unknown dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the results):

  • Loss: 0.7635
  • Accuracy: 0.7407
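
Since the card does not yet document usage, here is a minimal inference sketch. It assumes the checkpoint is hosted as Joy28/videomae-base-finetuned-subset-200epochs and follows the standard transformers VideoMAE classification API; the random 16-frame clip is only a placeholder for frames sampled from a real video.

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

ckpt = "Joy28/videomae-base-finetuned-subset-200epochs"
processor = VideoMAEImageProcessor.from_pretrained(ckpt)
model = VideoMAEForVideoClassification.from_pretrained(ckpt)

# VideoMAE-base expects 16 RGB frames per clip; a real pipeline would
# sample these from a video file. Random frames are placeholders only.
video = list(np.random.randint(0, 256, (16, 3, 224, 224), dtype=np.uint8))

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Label names depend on the (unknown) fine-tuning dataset.
predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])
```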

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 11100
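
As a rough reproduction guide, these values map onto transformers' TrainingArguments as sketched below. The output directory and evaluation strategy are assumptions rather than details from the original run, and the Adam betas and epsilon listed above are the library defaults.

```python
from transformers import TrainingArguments

# A minimal sketch, assuming the standard Trainer workflow; dataset
# wiring and model instantiation are omitted.
training_args = TrainingArguments(
    output_dir="videomae-base-finetuned-subset-200epochs",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=11100,
    evaluation_strategy="epoch",  # assumed from the per-epoch log below
    # Adam betas=(0.9, 0.999) and epsilon=1e-8 are TrainingArguments defaults.
)
```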

Training results

Training Loss | Epoch | Step | Validation Loss | Accuracy
0.6058 0.01 56 0.7442 0.7880
0.4908 1.01 112 0.7775 0.7558
0.5326 2.01 168 0.7973 0.7419
0.4768 3.01 224 0.8451 0.7281
0.4243 4.01 280 0.9361 0.6728
0.6921 5.01 336 0.8979 0.7097
0.3182 6.01 392 0.8852 0.7235
0.6085 7.01 448 0.9224 0.7097
0.4067 8.01 504 0.9631 0.6682
0.47 9.01 560 0.9193 0.7465
0.5058 10.01 616 0.8967 0.7650
0.4187 11.01 672 0.7403 0.7834
0.6033 12.01 728 1.0005 0.6221
0.5032 13.01 784 1.1420 0.5899
0.5967 14.01 840 1.2590 0.5484
0.3103 15.01 896 0.9723 0.6544
0.4201 16.01 952 1.1665 0.6406
0.6246 17.01 1008 1.2497 0.4977
0.6306 18.01 1064 1.3829 0.5668
0.4179 19.01 1120 1.0787 0.5806
0.5468 20.01 1176 1.1144 0.5714
0.4166 21.01 1232 0.7674 0.6912
0.3844 22.01 1288 0.9260 0.6959
0.5138 23.01 1344 0.9093 0.7097
0.792 24.01 1400 0.7327 0.7465
0.5944 25.01 1456 0.8933 0.7650
0.4855 26.01 1512 0.9830 0.6636
0.6896 27.01 1568 0.7896 0.6590
0.3617 28.01 1624 0.8900 0.6544
0.6362 29.01 1680 1.0237 0.6912
0.6475 30.01 1736 1.1399 0.6037
0.5088 31.01 1792 0.7190 0.7742
0.7271 32.01 1848 0.9492 0.6359
0.3171 33.01 1904 0.9431 0.7281
0.5847 34.01 1960 0.7997 0.7235
0.4703 35.01 2016 0.9506 0.7051
0.4995 36.01 2072 1.0830 0.7005
0.5682 37.01 2128 1.0100 0.7005
0.6424 38.01 2184 0.9587 0.6452
0.5897 39.01 2240 0.8807 0.7097
0.5222 40.01 2296 1.1219 0.6682
0.5239 41.01 2352 1.0848 0.6406
0.5957 42.01 2408 0.9640 0.6866
0.5279 43.01 2464 1.0291 0.5853
0.3545 44.01 2520 0.8908 0.6636
0.6066 45.01 2576 1.2505 0.6406
0.3658 46.01 2632 0.8362 0.6866
0.5454 47.01 2688 1.3975 0.5622
0.5956 48.01 2744 0.8236 0.6590
0.4107 49.01 2800 1.2610 0.6267
0.462 50.01 2856 1.2553 0.6406
0.4837 51.01 2912 1.0389 0.6359
0.621 52.01 2968 0.8281 0.7235
0.4293 53.01 3024 1.0426 0.6267
0.4255 54.01 3080 1.2942 0.5806
0.5607 55.01 3136 1.1234 0.6498
0.3104 56.01 3192 1.0643 0.6590
0.3335 57.01 3248 1.2160 0.6590
0.4232 58.01 3304 1.3532 0.5806
0.6238 59.01 3360 0.9208 0.7005
0.369 60.01 3416 1.2186 0.5530
0.3874 61.01 3472 1.1746 0.6452
0.3421 62.01 3528 1.2017 0.5945
0.4243 63.01 3584 1.0288 0.6728
0.2806 64.01 3640 0.8483 0.7419
0.5357 65.01 3696 1.0890 0.6359
0.5155 66.01 3752 1.1885 0.6359
0.4367 67.01 3808 1.0738 0.6820
0.48 68.01 3864 1.0894 0.6866
0.4703 69.01 3920 1.2252 0.6498
0.4531 70.01 3976 1.0584 0.6498
0.2898 71.01 4032 1.7486 0.5576
0.3684 72.01 4088 1.0524 0.6406
0.2752 73.01 4144 1.2744 0.6728
0.3092 74.01 4200 1.3918 0.5806
0.3507 75.01 4256 1.4599 0.6544
0.4722 76.01 4312 1.0549 0.7143
0.4059 77.01 4368 1.2727 0.6728
0.2734 78.01 4424 1.1258 0.6959
0.4168 79.01 4480 0.9788 0.7189
0.4456 80.01 4536 1.4757 0.6544
0.4519 81.01 4592 1.2796 0.6820
0.5283 82.01 4648 1.2542 0.7051
0.4738 83.01 4704 1.2781 0.6083
0.2128 84.01 4760 1.0077 0.6866
0.3262 85.01 4816 1.0287 0.6820
0.3631 86.01 4872 1.3574 0.6544
0.4085 87.01 4928 1.1976 0.7235
0.3582 88.01 4984 1.4126 0.6544
0.3564 89.01 5040 1.3488 0.6406
0.4207 90.01 5096 1.0565 0.7005
0.4307 91.01 5152 0.9833 0.7281
0.3863 92.01 5208 0.9340 0.6912
0.2949 93.01 5264 0.9835 0.7143
0.2957 94.01 5320 1.1397 0.7235
0.3767 95.01 5376 1.4135 0.6221
0.4949 96.01 5432 1.0483 0.7189
0.3058 97.01 5488 1.8241 0.5530
0.3406 98.01 5544 1.7386 0.5760
0.2319 99.01 5600 1.4739 0.6175
0.5261 100.01 5656 1.0822 0.7143
0.4181 101.01 5712 1.2876 0.6728
0.243 102.01 5768 1.0783 0.7235
0.2603 103.01 5824 1.4557 0.6129
0.4892 104.01 5880 1.2557 0.6912
0.3073 105.01 5936 1.3899 0.5991
0.3601 106.01 5992 1.2048 0.6820
0.4371 107.01 6048 1.3645 0.6866
0.5712 108.01 6104 1.2281 0.6636
0.3697 109.01 6160 1.4402 0.6544
0.2978 110.01 6216 1.3769 0.6912
0.303 111.01 6272 1.3096 0.6959
0.4606 112.01 6328 1.2236 0.7005
0.2554 113.01 6384 1.2662 0.6959
0.3033 114.01 6440 1.2476 0.6406
0.3025 115.01 6496 1.0474 0.7143
0.3513 116.01 6552 1.4692 0.6452
0.4205 117.01 6608 1.2675 0.6912
0.3898 118.01 6664 1.4018 0.6590
0.2184 119.01 6720 1.2402 0.6959
0.319 120.01 6776 1.0747 0.7097
0.2455 121.01 6832 1.3515 0.7051
0.2138 122.01 6888 1.5175 0.6682
0.3805 123.01 6944 1.4817 0.6820
0.3942 124.01 7000 1.5235 0.6221
0.2207 125.01 7056 1.6295 0.5945
0.2217 126.01 7112 1.3348 0.6912
0.3173 127.01 7168 1.3566 0.7097
0.4952 128.01 7224 1.2188 0.7327
0.3238 129.01 7280 1.2574 0.7143
0.1525 130.01 7336 1.5508 0.6313
0.2518 131.01 7392 1.3058 0.6912
0.4523 132.01 7448 1.7539 0.6313
0.3732 133.01 7504 1.4478 0.6820
0.2432 134.01 7560 1.3595 0.6912
0.2798 135.01 7616 1.5007 0.6866
0.3436 136.01 7672 1.3162 0.7465
0.3033 137.01 7728 1.3700 0.7051
0.3457 138.01 7784 1.1052 0.7465
0.1381 139.01 7840 1.5786 0.6959
0.3067 140.01 7896 1.5155 0.6912
0.269 141.01 7952 1.2751 0.7512
0.2646 142.01 8008 1.6017 0.6774
0.3933 143.01 8064 1.4294 0.7005
0.6315 144.01 8120 1.3814 0.6866
0.2814 145.01 8176 1.1689 0.7512
0.2749 146.01 8232 1.3208 0.7005
0.3966 147.01 8288 1.2817 0.7189
0.1787 148.01 8344 1.4568 0.7189
0.3006 149.01 8400 1.3312 0.7143
0.2871 150.01 8456 1.5808 0.6452
0.2018 151.01 8512 1.6682 0.6267
0.2698 152.01 8568 1.4281 0.6590
0.162 153.01 8624 1.4369 0.7051
0.3961 154.01 8680 1.3771 0.7143
0.4034 155.01 8736 1.5444 0.6452
0.2462 156.01 8792 1.4677 0.6728
0.2564 157.01 8848 1.6085 0.6590
0.2905 158.01 8904 1.3037 0.6912
0.2762 159.01 8960 1.3974 0.7051
0.1604 160.01 9016 1.5176 0.6959
0.2399 161.01 9072 1.4504 0.7143
0.3398 162.01 9128 1.4675 0.6728
0.2495 163.01 9184 1.3757 0.7005
0.3076 164.01 9240 1.3699 0.7051
0.2491 165.01 9296 1.4333 0.7005
0.1666 166.01 9352 1.6465 0.6313
0.1871 167.01 9408 1.6614 0.6544
0.2169 168.01 9464 1.8141 0.6175
0.3918 169.01 9520 1.3402 0.7097
0.2697 170.01 9576 1.4295 0.6774
0.2261 171.01 9632 1.5952 0.6452
0.1894 172.01 9688 1.5468 0.6590
0.1714 173.01 9744 1.4434 0.6636
0.3137 174.01 9800 1.5525 0.6313
0.267 175.01 9856 1.6447 0.6452
0.0797 176.01 9912 1.5593 0.6682
0.2698 177.01 9968 1.3952 0.7005
0.1364 178.01 10024 1.6720 0.6498
0.2342 179.01 10080 1.6315 0.6682
0.1909 180.01 10136 1.5374 0.7051
0.2234 181.01 10192 1.5861 0.7097
0.3425 182.01 10248 1.5664 0.6912
0.4092 183.01 10304 1.6135 0.6774
0.2427 184.01 10360 1.5366 0.6866
0.3751 185.01 10416 1.5561 0.6959
0.1831 186.01 10472 1.6049 0.7005
0.2207 187.01 10528 1.6072 0.6959
0.1096 188.01 10584 1.5016 0.7097
0.2417 189.01 10640 1.5027 0.7097
0.2974 190.01 10696 1.4897 0.7097
0.2296 191.01 10752 1.4927 0.7235
0.3323 192.01 10808 1.4947 0.7235
0.3002 193.01 10864 1.5225 0.7143
0.23 194.01 10920 1.4965 0.7189
0.3147 195.01 10976 1.5123 0.7051
0.1344 196.01 11032 1.5192 0.7051
0.1843 197.01 11088 1.5235 0.7097
0.1902 198.0 11100 1.5238 0.7097
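
Validation loss trends upward after the early epochs (best value 0.7190 at epoch 31) while training loss continues to fall, which suggests the model overfits well before the 200-epoch budget is reached.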

Framework versions

  • Transformers 4.36.2
  • PyTorch 1.13.1
  • Datasets 2.16.1
  • Tokenizers 0.15.0