CTMAE-P2-V2-S5

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2006
  • Accuracy: 0.75
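
Usage details are not documented yet, so below is a minimal inference sketch. It assumes the checkpoint is hosted as beingbatman/CTMAE-P2-V2-S5 (the repo id shown for this card), that the repo includes the image-processor config, and that the model uses the standard 16-frame VideoMAE input; the random clip is only a placeholder for real decoded video frames.

```python
import numpy as np
import torch
from transformers import VideoMAEForVideoClassification, VideoMAEImageProcessor

ckpt = "beingbatman/CTMAE-P2-V2-S5"  # assumed repo id
processor = VideoMAEImageProcessor.from_pretrained(ckpt)
model = VideoMAEForVideoClassification.from_pretrained(ckpt)
model.eval()

# Placeholder clip: 16 RGB frames of 224x224 (swap in real decoded frames).
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])
```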

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 13050
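
For reference, these settings map onto transformers.TrainingArguments roughly as sketched below; fields not listed above (such as output_dir) are assumptions, and the dataset/Trainer wiring is omitted since the training data is undocumented.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="CTMAE-P2-V2-S5",    # assumed; not stated in the card
    learning_rate=1e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=13050,
)
```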

Training results

The reported evaluation result (loss 1.2006, accuracy 0.75) corresponds to the checkpoint at step 5481 (epoch 20.02).

Training Loss  Epoch  Step   Validation Loss  Accuracy
-------------  -----  -----  ---------------  --------
1.5874         0.02   261    2.2577           0.5682
0.581          1.02   522    2.4954           0.5682
1.5552         2.02   783    2.2144           0.5682
0.7597         3.02   1044   2.1388           0.5682
1.8176         4.02   1305   1.5857           0.5682
0.9596         5.02   1566   1.9454           0.5682
0.8402         6.02   1827   2.0550           0.5682
1.0823         7.02   2088   1.7864           0.5682
1.0229         8.02   2349   1.8592           0.5682
0.7113         9.02   2610   1.4045           0.5682
1.3068         10.02  2871   1.4536           0.5682
1.7964         11.02  3132   1.8695           0.5682
1.6925         12.02  3393   0.7860           0.5682
0.3966         13.02  3654   2.1610           0.5682
0.0112         14.02  3915   2.7138           0.5682
0.5847         15.02  4176   0.8433           0.7045
0.6547         16.02  4437   1.7384           0.6136
0.7854         17.02  4698   1.3477           0.6818
1.0052         18.02  4959   1.4197           0.7045
1.4927         19.02  5220   2.2046           0.6136
0.5386         20.02  5481   1.2006           0.75
0.7256         21.02  5742   1.5015           0.7273
0.8462         22.02  6003   1.6405           0.6591
0.64           23.02  6264   2.2160           0.5682
1.0358         24.02  6525   2.6674           0.5682
0.0003         25.02  6786   3.2237           0.5682
1.449          26.02  7047   2.9910           0.5455
0.6425         27.02  7308   2.9668           0.5682
0.0038         28.02  7569   3.2074           0.5455
0.4198         29.02  7830   3.4554           0.5455
0.0002         30.02  8091   2.2222           0.6591
0.0087         31.02  8352   2.7093           0.5455
0.2823         32.02  8613   2.8994           0.5909
0.0009         33.02  8874   2.9261           0.5909
0.0064         34.02  9135   2.4037           0.6818
0.7506         35.02  9396   2.8436           0.6364
0.6686         36.02  9657   3.1198           0.5682
0.0089         37.02  9918   2.2353           0.6591
0.6753         38.02  10179  3.0288           0.6364
0.0003         39.02  10440  2.4052           0.6591
0.295          40.02  10701  3.7579           0.5682
0.0002         41.02  10962  3.3831           0.5909
0.5379         42.02  11223  3.5119           0.5455
0.0001         43.02  11484  3.3207           0.5909
0.0001         44.02  11745  3.1331           0.6136
0.0002         45.02  12006  3.1938           0.5909
0.0001         46.02  12267  3.2387           0.5909
0.6632         47.02  12528  3.3889           0.5909
0.2849         48.02  12789  3.3584           0.6364
0.0001         49.02  13050  3.2970           0.6136

Framework versions

  • Transformers 4.46.2
  • PyTorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0