---
license: mit
base_model: xlnet/xlnet-base-cased
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: xlnet-base-cased-airlines-news-multi-label
results: []
---
# xlnet-base-cased-airlines-news-multi-label
This model is a fine-tuned version of [xlnet/xlnet-base-cased](https://huggingface.co./xlnet/xlnet-base-cased) for multi-label classification of airline news. The training dataset is not named in the card metadata.
It achieves the following results on the evaluation set (these figures match the epoch-3 checkpoint in the table below, which has the best validation F1):
- Loss: 0.2567
- F1: 0.8989
- ROC AUC: 0.6475
## Model description
`xlnet-base-cased` fine-tuned with a multi-label sequence-classification head for tagging airline-related news. The label set, preprocessing, and decision threshold used during training are not documented here.
## Intended uses & limitations
Intended for multi-label tagging of airline news headlines or articles. Because the training data and label definitions are undocumented, validate the model on your own data before relying on it; a hedged inference sketch follows.
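The snippet below is a minimal inference sketch, not the author's published usage code. The repository id, the 0.5 decision threshold, and the presence of readable `id2label` names in the config are assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed repository id, derived from the uploader and model name on this card.
model_id = "dahe827/xlnet-base-cased-airlines-news-multi-label"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "Airline announces new direct routes and an overhaul of its loyalty program."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label setup: score each label independently with a sigmoid,
# then keep labels above an assumed 0.5 threshold.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```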
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 7e-05
- train_batch_size: 12
- eval_batch_size: 12
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 150
- num_epochs: 40
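For reference, here is a minimal sketch of `TrainingArguments` consistent with the hyperparameters above; the actual training script is not part of this card, so the output directory, logging cadence, and evaluation strategy are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlnet-base-cased-airlines-news-multi-label",  # assumed
    learning_rate=7e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=12,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=150,
    num_train_epochs=40,
    eval_strategy="epoch",  # the results table reports one evaluation per epoch
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the Transformers default
    # optimizer, so no explicit optimizer arguments are needed here.
)
```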
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 | ROC AUC |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
| No log | 1.0 | 170 | 0.2834 | 0.8827 | 0.5576 |
| No log | 2.0 | 340 | 0.2670 | 0.8918 | 0.6102 |
| 0.2923 | 3.0 | 510 | 0.2567 | 0.8989 | 0.6475 |
| 0.2923 | 4.0 | 680 | 0.2526 | 0.8910 | 0.6685 |
| 0.2923 | 5.0 | 850 | 0.2512 | 0.8825 | 0.6352 |
| 0.2571 | 6.0 | 1020 | 0.2514 | 0.8863 | 0.6708 |
| 0.2571 | 7.0 | 1190 | 0.2454 | 0.8872 | 0.6490 |
| 0.2571 | 8.0 | 1360 | 0.2495 | 0.8884 | 0.6660 |
| 0.2468 | 9.0 | 1530 | 0.2467 | 0.8881 | 0.6725 |
| 0.2468 | 10.0 | 1700 | 0.2554 | 0.8815 | 0.6514 |
| 0.2468 | 11.0 | 1870 | 0.2474 | 0.8883 | 0.6603 |
| 0.2363 | 12.0 | 2040 | 0.2478 | 0.8912 | 0.6943 |
| 0.2363 | 13.0 | 2210 | 0.2492 | 0.8964 | 0.6976 |
| 0.2363 | 14.0 | 2380 | 0.2530 | 0.8936 | 0.7121 |
| 0.2332 | 15.0 | 2550 | 0.2497 | 0.8893 | 0.6830 |
| 0.2332 | 16.0 | 2720 | 0.2483 | 0.8922 | 0.7008 |
| 0.2332 | 17.0 | 2890 | 0.2489 | 0.8905 | 0.6782 |
| 0.23 | 18.0 | 3060 | 0.2496 | 0.8877 | 0.6927 |
| 0.23 | 19.0 | 3230 | 0.2494 | 0.8855 | 0.6652 |
| 0.23 | 20.0 | 3400 | 0.2483 | 0.8929 | 0.6903 |
| 0.2246 | 21.0 | 3570 | 0.2503 | 0.8902 | 0.6838 |
| 0.2246 | 22.0 | 3740 | 0.2506 | 0.8854 | 0.6805 |
| 0.2246 | 23.0 | 3910 | 0.2515 | 0.8900 | 0.6887 |
| 0.2214 | 24.0 | 4080 | 0.2501 | 0.8894 | 0.6773 |
| 0.2214 | 25.0 | 4250 | 0.2528 | 0.8878 | 0.6870 |
| 0.2214 | 26.0 | 4420 | 0.2519 | 0.8918 | 0.6895 |
| 0.2203 | 27.0 | 4590 | 0.2558 | 0.8897 | 0.6830 |
| 0.2203 | 28.0 | 4760 | 0.2554 | 0.8919 | 0.6952 |
| 0.2203 | 29.0 | 4930 | 0.2537 | 0.8957 | 0.7025 |
| 0.2193 | 30.0 | 5100 | 0.2513 | 0.8951 | 0.7025 |
| 0.2193 | 31.0 | 5270 | 0.2565 | 0.8946 | 0.7130 |
| 0.2193 | 32.0 | 5440 | 0.2542 | 0.8935 | 0.6960 |
| 0.2178 | 33.0 | 5610 | 0.2545 | 0.8970 | 0.7090 |
| 0.2178 | 34.0 | 5780 | 0.2546 | 0.8960 | 0.7138 |
| 0.2178 | 35.0 | 5950 | 0.2550 | 0.8942 | 0.7073 |
| 0.2173 | 36.0 | 6120 | 0.2545 | 0.8942 | 0.7073 |
| 0.2173 | 37.0 | 6290 | 0.2537 | 0.8925 | 0.7008 |
| 0.2173 | 38.0 | 6460 | 0.2541 | 0.8942 | 0.7073 |
| 0.2164 | 39.0 | 6630 | 0.2537 | 0.8940 | 0.7016 |
| 0.2164 | 40.0 | 6800 | 0.2540 | 0.8942 | 0.7073 |
### Framework versions
- Transformers 4.41.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1