---
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: videomae-base-finetuned-subset
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# videomae-base-finetuned-subset

This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co./MCG-NJU/videomae-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0747
- Accuracy: 0.6991

## Model description

The model is a VideoMAE (video masked autoencoder) video Transformer fine-tuned from the [MCG-NJU/videomae-base](https://huggingface.co./MCG-NJU/videomae-base) checkpoint for video classification. The label set and further details of the fine-tuning setup have not been documented.

## Intended uses & limitations

More information needed
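
The snippet below sketches how a checkpoint like this one can be loaded for video classification inference with the `transformers` library. It is a minimal, hedged example rather than a confirmed usage recipe for this model: the repository id `videomae-base-finetuned-subset`, the 16-frame clip length, and the 224x224 frame size are assumptions based on the VideoMAE base configuration.

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

# Hypothetical repository id; replace with the actual location of this checkpoint.
model_name = "videomae-base-finetuned-subset"

processor = VideoMAEImageProcessor.from_pretrained(model_name)
model = VideoMAEForVideoClassification.from_pretrained(model_name)
model.eval()

# Dummy clip: 16 RGB frames of 224x224 pixels (VideoMAE-base defaults),
# passed to the processor as a list of H x W x C uint8 arrays.
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```
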
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch that mirrors them follows the list):
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 7020
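
For reference, these settings map onto `transformers.TrainingArguments` roughly as in the sketch below. This is an assumed reconstruction, not the original training script: the output directory and the per-epoch evaluation cadence are placeholders, and the reported betas/epsilon happen to match the Trainer's default optimizer configuration.

```python
from transformers import TrainingArguments

# Assumed reconstruction of the arguments implied by the hyperparameter list above.
# Only the values reported in this card are meaningful; output_dir and
# evaluation_strategy are placeholders.
training_args = TrainingArguments(
    output_dir="videomae-base-finetuned-subset",  # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    adam_beta1=0.9,      # reported betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,   # reported epsilon=1e-08
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=7020,
    evaluation_strategy="epoch",  # assumption: the results table reports one evaluation per epoch
)
```
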
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.9225        | 0.02  | 118  | 0.7363          | 0.7419   |
| 0.8357        | 1.02  | 236  | 0.9119          | 0.7005   |
| 0.474         | 2.02  | 354  | 0.9698          | 0.6820   |
| 0.7899        | 3.02  | 472  | 1.1351          | 0.6774   |
| 0.9015        | 4.02  | 590  | 1.3823          | 0.4977   |
| 0.7402        | 5.02  | 708  | 0.8661          | 0.6959   |
| 0.6343        | 6.02  | 826  | 0.6689          | 0.7005   |
| 0.7427        | 7.02  | 944  | 0.9109          | 0.6728   |
| 0.5898        | 8.02  | 1062 | 1.0127          | 0.5945   |
| 0.6258        | 9.02  | 1180 | 0.7131          | 0.7235   |
| 0.9957        | 10.02 | 1298 | 0.9507          | 0.6728   |
| 0.401         | 11.02 | 1416 | 0.6259          | 0.7189   |
| 0.5422        | 12.02 | 1534 | 0.9453          | 0.6774   |
| 0.6852        | 13.02 | 1652 | 0.8649          | 0.7005   |
| 0.8469        | 14.02 | 1770 | 0.9379          | 0.6912   |
| 0.8492        | 15.02 | 1888 | 0.9003          | 0.6452   |
| 0.7633        | 16.02 | 2006 | 0.7601          | 0.7235   |
| 0.6063        | 17.02 | 2124 | 0.6181          | 0.7788   |
| 0.6436        | 18.02 | 2242 | 0.9445          | 0.6313   |
| 0.8931        | 19.02 | 2360 | 0.8515          | 0.7281   |
| 0.8599        | 20.02 | 2478 | 1.0786          | 0.6359   |
| 0.5183        | 21.02 | 2596 | 0.9481          | 0.6866   |
| 0.7982        | 22.02 | 2714 | 0.8364          | 0.7235   |
| 1.0003        | 23.02 | 2832 | 0.7811          | 0.7327   |
| 0.6666        | 24.02 | 2950 | 0.7552          | 0.7465   |
| 0.8527        | 25.02 | 3068 | 0.8201          | 0.7189   |
| 0.4678        | 26.02 | 3186 | 1.0260          | 0.6959   |
| 0.7354        | 27.02 | 3304 | 0.8520          | 0.6866   |
| 1.1097        | 28.02 | 3422 | 0.9239          | 0.7327   |
| 0.6264        | 29.02 | 3540 | 0.6894          | 0.7558   |
| 0.3348        | 30.02 | 3658 | 0.6230          | 0.8065   |
| 0.5548        | 31.02 | 3776 | 0.6431          | 0.8203   |
| 0.4242        | 32.02 | 3894 | 0.8081          | 0.7051   |
| 0.5805        | 33.02 | 4012 | 0.5598          | 0.8203   |
| 0.7064        | 34.02 | 4130 | 0.7341          | 0.7926   |
| 0.2534        | 35.02 | 4248 | 0.6685          | 0.7834   |
| 0.7578        | 36.02 | 4366 | 0.7592          | 0.7604   |
| 0.5822        | 37.02 | 4484 | 0.9472          | 0.7281   |
| 0.2939        | 38.02 | 4602 | 0.8888          | 0.7281   |
| 0.4795        | 39.02 | 4720 | 1.0768          | 0.6636   |
| 0.4038        | 40.02 | 4838 | 0.6452          | 0.8065   |
| 0.8347        | 41.02 | 4956 | 0.7040          | 0.7926   |
| 0.4113        | 42.02 | 5074 | 0.8012          | 0.7373   |
| 0.3681        | 43.02 | 5192 | 0.7622          | 0.7880   |
| 1.0092        | 44.02 | 5310 | 0.7932          | 0.7880   |
| 0.321         | 45.02 | 5428 | 0.9069          | 0.7373   |
| 0.399         | 46.02 | 5546 | 0.6439          | 0.8111   |
| 0.3699        | 47.02 | 5664 | 0.7740          | 0.7696   |
| 0.4297        | 48.02 | 5782 | 0.6811          | 0.8249   |
| 0.2783        | 49.02 | 5900 | 0.5868          | 0.8525   |
| 0.4946        | 50.02 | 6018 | 0.6732          | 0.7926   |
| 0.3058        | 51.02 | 6136 | 0.5511          | 0.8341   |
| 0.1286        | 52.02 | 6254 | 0.5877          | 0.8295   |
| 0.2013        | 53.02 | 6372 | 0.6508          | 0.8157   |
| 0.2027        | 54.02 | 6490 | 0.6630          | 0.8157   |
| 0.6267        | 55.02 | 6608 | 0.7373          | 0.8065   |
| 0.4561        | 56.02 | 6726 | 0.7383          | 0.8018   |
| 0.7002        | 57.02 | 6844 | 0.7073          | 0.8111   |
| 0.1823        | 58.02 | 6962 | 0.6871          | 0.8203   |
| 0.2439        | 59.01 | 7020 | 0.6901          | 0.8203   |
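
The accuracy column above is the metric the Trainer computed at each evaluation. A `compute_metrics` callback built on the `evaluate` library, as sketched below, is the usual way such values are produced; this is an assumption about the training setup rather than the exact code that was used.

```python
import numpy as np
import evaluate

# Assumed metric function: argmax over the classification logits,
# scored with the standard `accuracy` metric from the `evaluate` library.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # The Trainer passes a (predictions, label_ids) pair.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```
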
### Framework versions

- Transformers 4.36.2
- PyTorch 1.13.1
- Datasets 2.16.1
- Tokenizers 0.15.0