CTMAE-P2-V2-S4

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4945
  • Accuracy: 0.6667

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 6500
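With a warmup ratio of 0.1 over 6500 training steps, the schedule warms up for 650 steps and then decays linearly. A minimal sketch of that schedule (assuming the usual linear warmup-then-decay behavior; this helper is illustrative, not taken from the training code):

```python
def linear_lr(step, base_lr=1e-05, total_steps=6500, warmup_ratio=0.1):
    """Learning rate at a given step under linear warmup + linear decay."""
    warmup_steps = int(total_steps * warmup_ratio)  # 0.1 * 6500 = 650
    if step < warmup_steps:
        # ramp from 0 up to base_lr over the warmup phase
        return base_lr * step / warmup_steps
    # decay linearly from base_lr down to 0 over the remaining steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_lr(650))   # peak learning rate, reached at the end of warmup
print(linear_lr(6500))  # schedule ends at zero
```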

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|---------------|---------|------|-----------------|----------|
| 0.6454        | 0.0202  | 131  | 0.8373          | 0.5556   |
| 0.4683        | 1.0202  | 262  | 2.0593          | 0.5556   |
| 1.185         | 2.0202  | 393  | 2.0221          | 0.5556   |
| 0.6278        | 3.0202  | 524  | 0.8492          | 0.5556   |
| 1.3943        | 4.0202  | 655  | 1.5432          | 0.5556   |
| 0.8242        | 5.0202  | 786  | 1.6719          | 0.5556   |
| 0.8047        | 6.0202  | 917  | 1.7035          | 0.5556   |
| 1.4137        | 7.0202  | 1048 | 0.9401          | 0.5333   |
| 0.763         | 8.0202  | 1179 | 1.8927          | 0.5556   |
| 0.7333        | 9.0202  | 1310 | 0.9205          | 0.6      |
| 0.7436        | 10.0202 | 1441 | 1.5193          | 0.6222   |
| 0.3139        | 11.0202 | 1572 | 1.1921          | 0.6222   |
| 0.3918        | 12.0202 | 1703 | 0.7039          | 0.6222   |
| 0.667         | 13.0202 | 1834 | 0.8153          | 0.6222   |
| 0.967         | 14.0202 | 1965 | 1.5073          | 0.5556   |
| 0.0771        | 15.0202 | 2096 | 1.5141          | 0.5778   |
| 0.5622        | 16.0202 | 2227 | 1.5615          | 0.6      |
| 1.0141        | 17.0202 | 2358 | 1.6540          | 0.6222   |
| 2.1512        | 18.0202 | 2489 | 1.2866          | 0.6444   |
| 1.0352        | 19.0202 | 2620 | 1.9383          | 0.6      |
| 0.2068        | 20.0202 | 2751 | 1.8477          | 0.6      |
| 0.2804        | 21.0202 | 2882 | 1.4945          | 0.6667   |
| 0.4271        | 22.0202 | 3013 | 1.8007          | 0.6222   |
| 0.8258        | 23.0202 | 3144 | 1.8842          | 0.5778   |
| 0.6762        | 24.0202 | 3275 | 1.8649          | 0.6222   |
| 1.1151        | 25.0202 | 3406 | 2.7759          | 0.5333   |
| 0.1755        | 26.0202 | 3537 | 2.0492          | 0.6444   |
| 0.4496        | 27.0202 | 3668 | 2.2949          | 0.5556   |
| 0.4336        | 28.0202 | 3799 | 2.3240          | 0.5333   |
| 0.9503        | 29.0202 | 3930 | 2.0642          | 0.6222   |
| 0.4413        | 30.0202 | 4061 | 2.3833          | 0.5556   |
| 0.0132        | 31.0202 | 4192 | 2.5514          | 0.5778   |
| 0.2513        | 32.0202 | 4323 | 2.4046          | 0.5778   |
| 0.4845        | 33.0202 | 4454 | 2.5703          | 0.6      |
| 0.8916        | 34.0202 | 4585 | 2.5372          | 0.6      |
| 0.3173        | 35.0202 | 4716 | 2.6754          | 0.6      |
| 0.4552        | 36.0202 | 4847 | 2.6613          | 0.5778   |
| 0.0155        | 37.0202 | 4978 | 2.4057          | 0.6222   |
| 0.6358        | 38.0202 | 5109 | 2.4891          | 0.6      |
| 0.834         | 39.0202 | 5240 | 2.6045          | 0.6222   |
| 0.0008        | 40.0202 | 5371 | 2.5713          | 0.6222   |
| 0.523         | 41.0202 | 5502 | 2.6842          | 0.5778   |
| 0.5313        | 42.0202 | 5633 | 2.7778          | 0.5556   |
| 0.0002        | 43.0202 | 5764 | 2.5852          | 0.6222   |
| 0.0002        | 44.0202 | 5895 | 2.2989          | 0.6444   |
| 0.0004        | 45.0202 | 6026 | 2.4952          | 0.5778   |
| 0.1427        | 46.0202 | 6157 | 2.2855          | 0.6667   |
| 0.1096        | 47.0202 | 6288 | 2.5532          | 0.5778   |
| 0.0048        | 48.0202 | 6419 | 2.5509          | 0.5778   |
| 0.0004        | 49.0125 | 6500 | 2.5656          | 0.5778   |
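A side observation, inferred from the reported numbers rather than stated anywhere in the card: every accuracy value above is, to rounding, a multiple of 1/45, which would be consistent with an evaluation set of roughly 45 clips. A quick check:

```python
# Distinct accuracy values reported in the table above. Each should sit
# within rounding distance of k/45 for some integer k, if the eval set
# really has 45 samples (this is an inference, not a documented fact).
accuracies = [0.5333, 0.5556, 0.5778, 0.6, 0.6222, 0.6444, 0.6667]

for a in accuracies:
    k = round(a * 45)
    assert abs(a - k / 45) < 5e-4, f"{a} is not close to a multiple of 1/45"

print("all reported accuracies are consistent with k/45")
```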

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0

Model size: 304M parameters (Safetensors, F32 tensors)
