---
license: mit
tags:
- generated_from_trainer
base_model: xlnet/xlnet-base-cased
metrics:
- accuracy
- precision
- recall
model-index:
- name: xlnet-base-cased
  results: []
---

# xlnet-base-cased

This model is a fine-tuned version of [xlnet/xlnet-base-cased](https://huggingface.co./xlnet/xlnet-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6908
- Accuracy: 0.8273
- Precision: 0.8307
- Recall: 0.8273
- Precision Macro: 0.7836
- Recall Macro: 0.7606
- Macro Fpr: 0.0159
- Weighted Fpr: 0.0152
- Weighted Specificity: 0.9756
- Macro Specificity: 0.9865
- Weighted Sensitivity: 0.8218
- Macro Sensitivity: 0.7606
- F1 Micro: 0.8218
- F1 Macro: 0.7664
- F1 Weighted: 0.8189
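
The metric list above implies a multi-class sequence-classification head. Below is a minimal inference sketch, assuming the fine-tuned checkpoint is published to the Hub; the repo id `your-username/xlnet-base-cased` is a placeholder for wherever this model actually lives:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer
import torch

# Placeholder repo id -- substitute the actual Hub id of this checkpoint.
repo_id = "your-username/xlnet-base-cased"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("Example input to classify.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred_id = int(logits.argmax(dim=-1))
# id2label is read from the saved config; the raw index is shown if it is absent.
print(model.config.id2label.get(pred_id, pred_id))
```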

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
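
For anyone reproducing this run, a sketch of the corresponding `TrainingArguments` follows (the Adam betas and epsilon above are the library defaults, so they need not be set explicitly; `evaluation_strategy="epoch"` is an assumption based on the per-epoch rows in the results table below):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="xlnet-base-cased",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="epoch",  # assumption: matches the per-epoch table below
)
# Pass `args` to a transformers.Trainer together with the model, tokenizer,
# datasets, and a compute_metrics function (all omitted from this sketch).
```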

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro Fpr | Weighted Fpr | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| 1.2613        | 1.0   | 643   | 0.7758          | 0.7676   | 0.7673    | 0.7676 | 0.5269          | 0.5129       | 0.0220    | 0.0212       | 0.9680               | 0.9824            | 0.7676               | 0.5129            | 0.7676   | 0.4819   | 0.7524      |
| 0.7364        | 2.0   | 1286  | 0.6755          | 0.8071   | 0.8088    | 0.8071 | 0.7425          | 0.6972       | 0.0174    | 0.0168       | 0.9751               | 0.9855            | 0.8071               | 0.6972            | 0.8071   | 0.7019   | 0.8013      |
| 0.6021        | 3.0   | 1929  | 0.8443          | 0.8064   | 0.8016    | 0.8064 | 0.7270          | 0.7262       | 0.0176    | 0.0169       | 0.9718               | 0.9852            | 0.8064               | 0.7262            | 0.8064   | 0.7229   | 0.8014      |
| 0.4361        | 4.0   | 2572  | 0.8850          | 0.8002   | 0.8001    | 0.8002 | 0.7167          | 0.7048       | 0.0180    | 0.0175       | 0.9731               | 0.9849            | 0.8002               | 0.7048            | 0.8002   | 0.7051   | 0.7971      |
| 0.3359        | 5.0   | 3215  | 1.1264          | 0.8017   | 0.7981    | 0.8017 | 0.6531          | 0.6681       | 0.0181    | 0.0174       | 0.9732               | 0.9850            | 0.8017               | 0.6681            | 0.8017   | 0.6459   | 0.7962      |
| 0.2827        | 6.0   | 3858  | 1.1471          | 0.7994   | 0.8092    | 0.7994 | 0.7389          | 0.6922       | 0.0183    | 0.0176       | 0.9686               | 0.9845            | 0.7994               | 0.6922            | 0.7994   | 0.7042   | 0.7952      |
| 0.1945        | 7.0   | 4501  | 1.1841          | 0.8149   | 0.8129    | 0.8149 | 0.7850          | 0.7598       | 0.0166    | 0.0160       | 0.9746               | 0.9860            | 0.8149               | 0.7598            | 0.8149   | 0.7667   | 0.8122      |
| 0.1286        | 8.0   | 5144  | 1.3231          | 0.8079   | 0.8105    | 0.8079 | 0.7630          | 0.7216       | 0.0171    | 0.0167       | 0.9757               | 0.9856            | 0.8079               | 0.7216            | 0.8079   | 0.7283   | 0.8067      |
| 0.1304        | 9.0   | 5787  | 1.3869          | 0.8102   | 0.8118    | 0.8102 | 0.7705          | 0.7603       | 0.0171    | 0.0165       | 0.9741               | 0.9856            | 0.8102               | 0.7603            | 0.8102   | 0.7570   | 0.8088      |
| 0.0875        | 10.0  | 6430  | 1.6901          | 0.7823   | 0.7932    | 0.7823 | 0.7601          | 0.7020       | 0.0199    | 0.0195       | 0.9680               | 0.9834            | 0.7823               | 0.7020            | 0.7823   | 0.7192   | 0.7817      |
| 0.1075        | 11.0  | 7073  | 1.6517          | 0.7978   | 0.8021    | 0.7978 | 0.7513          | 0.7567       | 0.0183    | 0.0178       | 0.9758               | 0.9849            | 0.7978               | 0.7567            | 0.7978   | 0.7470   | 0.7935      |
| 0.0632        | 12.0  | 7716  | 1.5290          | 0.8149   | 0.8184    | 0.8149 | 0.7746          | 0.7772       | 0.0167    | 0.0160       | 0.9738               | 0.9859            | 0.8149               | 0.7772            | 0.8149   | 0.7707   | 0.8150      |
| 0.0565        | 13.0  | 8359  | 1.5766          | 0.8064   | 0.8107    | 0.8064 | 0.7528          | 0.7628       | 0.0174    | 0.0169       | 0.9769               | 0.9856            | 0.8064               | 0.7628            | 0.8064   | 0.7537   | 0.8061      |
| 0.0504        | 14.0  | 9002  | 1.7548          | 0.8048   | 0.8100    | 0.8048 | 0.7569          | 0.7702       | 0.0174    | 0.0170       | 0.9765               | 0.9854            | 0.8048               | 0.7702            | 0.8048   | 0.7553   | 0.8046      |
| 0.0295        | 15.0  | 9645  | 1.7570          | 0.8102   | 0.8226    | 0.8102 | 0.7705          | 0.7611       | 0.0168    | 0.0165       | 0.9770               | 0.9858            | 0.8102               | 0.7611            | 0.8102   | 0.7610   | 0.8141      |
| 0.0338        | 16.0  | 10288 | 1.7394          | 0.8110   | 0.8138    | 0.8110 | 0.7639          | 0.7659       | 0.0168    | 0.0164       | 0.9775               | 0.9859            | 0.8110               | 0.7659            | 0.8110   | 0.7613   | 0.8100      |
| 0.0444        | 17.0  | 10931 | 1.7975          | 0.8118   | 0.8201    | 0.8118 | 0.7511          | 0.7610       | 0.0168    | 0.0163       | 0.9775               | 0.9859            | 0.8118               | 0.7610            | 0.8118   | 0.7457   | 0.8129      |
| 0.0397        | 18.0  | 11574 | 1.6921          | 0.8149   | 0.8203    | 0.8149 | 0.7540          | 0.7854       | 0.0165    | 0.0160       | 0.9780               | 0.9862            | 0.8149               | 0.7854            | 0.8149   | 0.7553   | 0.8130      |
| 0.0356        | 19.0  | 12217 | 1.6908          | 0.8273   | 0.8307    | 0.8273 | 0.7764          | 0.7992       | 0.0152    | 0.0147       | 0.9784               | 0.9870            | 0.8273               | 0.7992            | 0.8273   | 0.7814   | 0.8265      |
| 0.0306        | 20.0  | 12860 | 1.8374          | 0.8180   | 0.8208    | 0.8180 | 0.7635          | 0.7756       | 0.0162    | 0.0156       | 0.9771               | 0.9863            | 0.8180               | 0.7756            | 0.8180   | 0.7620   | 0.8166      |
| 0.0234        | 21.0  | 13503 | 1.7738          | 0.8195   | 0.8185    | 0.8195 | 0.7947          | 0.7602       | 0.0160    | 0.0155       | 0.9760               | 0.9864            | 0.8195               | 0.7602            | 0.8195   | 0.7713   | 0.8174      |
| 0.0091        | 22.0  | 14146 | 1.8537          | 0.8172   | 0.8167    | 0.8172 | 0.7732          | 0.7646       | 0.0163    | 0.0157       | 0.9764               | 0.9862            | 0.8172               | 0.7646            | 0.8172   | 0.7654   | 0.8143      |
| 0.0138        | 23.0  | 14789 | 1.8306          | 0.8102   | 0.8173    | 0.8102 | 0.7729          | 0.7569       | 0.0167    | 0.0165       | 0.9757               | 0.9857            | 0.8102               | 0.7569            | 0.8102   | 0.7625   | 0.8125      |
| 0.0213        | 24.0  | 15432 | 1.9363          | 0.8125   | 0.8149    | 0.8125 | 0.7777          | 0.7540       | 0.0168    | 0.0162       | 0.9739               | 0.9858            | 0.8125               | 0.7540            | 0.8125   | 0.7622   | 0.8115      |
| 0.0034        | 25.0  | 16075 | 1.9552          | 0.8156   | 0.8179    | 0.8156 | 0.7843          | 0.7583       | 0.0165    | 0.0159       | 0.9740               | 0.9860            | 0.8156               | 0.7583            | 0.8156   | 0.7657   | 0.8147      |
| 0.0028        | 26.0  | 16718 | 1.9404          | 0.8172   | 0.8163    | 0.8172 | 0.7884          | 0.7591       | 0.0164    | 0.0157       | 0.9747               | 0.9861            | 0.8172               | 0.7591            | 0.8172   | 0.7656   | 0.8137      |
| 0.0105        | 27.0  | 17361 | 1.9156          | 0.8180   | 0.8132    | 0.8180 | 0.7848          | 0.7575       | 0.0164    | 0.0156       | 0.9742               | 0.9861            | 0.8180               | 0.7575            | 0.8180   | 0.7667   | 0.8140      |
| 0.0048        | 28.0  | 18004 | 1.9104          | 0.8203   | 0.8196    | 0.8203 | 0.7877          | 0.7615       | 0.0160    | 0.0154       | 0.9758               | 0.9864            | 0.8203               | 0.7615            | 0.8203   | 0.7658   | 0.8175      |
| 0.0011        | 29.0  | 18647 | 1.9312          | 0.8203   | 0.8185    | 0.8203 | 0.7888          | 0.7600       | 0.0161    | 0.0154       | 0.9755               | 0.9864            | 0.8203               | 0.7600            | 0.8203   | 0.7664   | 0.8173      |
| 0.0004        | 30.0  | 19290 | 1.9234          | 0.8218   | 0.8189    | 0.8218 | 0.7836          | 0.7606       | 0.0159    | 0.0152       | 0.9756               | 0.9865            | 0.8218               | 0.7606            | 0.8218   | 0.7664   | 0.8189      |
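
The specificity and FPR columns are not standard `evaluate` metrics; they follow from the multiclass confusion matrix, with per-class specificity = TN / (TN + FP) and FPR = FP / (FP + TN). The sketch below illustrates those definitions, assuming "macro" averages the classes uniformly and "weighted" weights them by support; it is not necessarily the exact script used to produce this card:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

def specificity_fpr(y_true, y_pred):
    """Macro- and weighted-averaged specificity and FPR for multiclass labels."""
    cm = confusion_matrix(y_true, y_pred)   # rows: true class, cols: predicted
    tp = np.diag(cm)
    fp = cm.sum(axis=0) - tp                # predicted as class c, but not c
    fn = cm.sum(axis=1) - tp                # class c examples that were missed
    tn = cm.sum() - (tp + fp + fn)

    specificity = tn / (tn + fp)            # per-class true-negative rate
    fpr = fp / (fp + tn)                    # per-class false-positive rate
    weights = cm.sum(axis=1) / cm.sum()     # class support fractions

    return {
        "macro_specificity": float(specificity.mean()),
        "weighted_specificity": float(specificity @ weights),
        "macro_fpr": float(fpr.mean()),
        "weighted_fpr": float(fpr @ weights),
    }

# Tiny 3-class example
print(specificity_fpr([0, 1, 2, 2, 1, 0], [0, 1, 2, 1, 1, 0]))
```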


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.19.0
- Tokenizers 0.15.1