|
--- |
|
license: mit |
|
base_model: microsoft/deberta-v3-large |
|
tags: |
|
- generated_from_trainer |
|
model-index: |
|
- name: deberta_large_hv_no_upsampling |
|
results: [] |
|
--- |
|
|
|
|
|
|
# deberta_large_hv_no_upsampling |
|
|
|
This model is a fine-tuned version of [microsoft/deberta-v3-large](https://huggingface.co./microsoft/deberta-v3-large) on an unspecified dataset.
|
It achieves the following results on the evaluation set: |
|
- Loss: 0.0563 |
|
- F1-macro subtask 1: 0.2338 |
|
- F1-micro subtask 1: 0.3429 |
|
- ROC AUC macro subtask 1: 0.5913
|
- F1-macro subtask 2: 0.0471 |
|
- F1-micro subtask 2: 0.0516 |
|
- ROC AUC macro subtask 2: 0.6153
|
- Self-direction: thought1: 0.0 |
|
- Self-direction: action1: 0.1688 |
|
- Stimulation1: 0.1392 |
|
- Hedonism1: 0.0377 |
|
- Achievement1: 0.3829 |
|
- Power: dominance1: 0.3254 |
|
- Power: resources1: 0.2112 |
|
- Face1: 0.2377 |
|
- Security: personal1: 0.2786 |
|
- Security: societal1: 0.4429 |
|
- Tradition1: 0.4493 |
|
- Conformity: rules1: 0.4815 |
|
- Conformity: interpersonal1: 0.0 |
|
- Humility1: 0.0 |
|
- Benevolence: caring1: 0.1292 |
|
- Benevolence: dependability1: 0.1145 |
|
- Universalism: concern1: 0.3467 |
|
- Universalism: nature1: 0.5707 |
|
- Universalism: tolerance1: 0.1259 |
|
- Self-direction: thought attained2: 0.0194 |
|
- Self-direction: thought constrained2: 0.0093 |
|
- Self-direction: action attained2: 0.0598 |
|
- Self-direction: action constrained2: 0.0261 |
|
- Stimulation attained2: 0.0632 |
|
- Stimulation constrained2: 0.0129 |
|
- Hedonism attained2: 0.0157 |
|
- Hedonism constrained2: 0.0053 |
|
- Achievement attained2: 0.1331 |
|
- Achievement constrained2: 0.0743 |
|
- Power: dominance attained2: 0.0744 |
|
- Power: dominance constrained2: 0.0529 |
|
- Power: resources attained2: 0.0934 |
|
- Power: resources constrained2: 0.0581 |
|
- Face attained2: 0.0192 |
|
- Face constrained2: 0.0457 |
|
- Security: personal attained2: 0.0214 |
|
- Security: personal constrained2: 0.0376 |
|
- Security: societal attained2: 0.1016 |
|
- Security: societal constrained2: 0.1709 |
|
- Tradition attained2: 0.0395 |
|
- Tradition constrained2: 0.0111 |
|
- Conformity: rules attained2: 0.0982 |
|
- Conformity: rules constrained2: 0.1113 |
|
- Conformity: interpersonal attained2: 0.0144 |
|
- Conformity: interpersonal constrained2: 0.0328 |
|
- Humility attained2: 0.0048 |
|
- Humility constrained2: 0.0014 |
|
- Benevolence: caring attained2: 0.0515 |
|
- Benevolence: caring constrained2: 0.0091 |
|
- Benevolence: dependability attained2: 0.0412 |
|
- Benevolence: dependability constrained2: 0.0181 |
|
- Universalism: concern attained2: 0.0800 |
|
- Universalism: concern constrained2: 0.0655 |
|
- Universalism: nature attained2: 0.0423 |
|
- Universalism: nature constrained2: 0.0462 |
|
- Universalism: tolerance attained2: 0.0125 |
|
- Universalism: tolerance constrained2: 0.0166 |
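For reference, the F1-macro and F1-micro scores above follow the standard multi-label definitions: macro averages the per-label F1 scores (so rare labels scoring 0.0, like "Humility", pull it down), while micro pools true-positive/false-positive/false-negative counts across all labels. A minimal sketch, assuming the standard definitions (not code from the original evaluation):

```python
# Sketch of multi-label F1-macro vs. F1-micro; assumes standard definitions,
# not the exact evaluation script used for this model.

def f1(tp, fp, fn):
    """F1 from true-positive, false-positive, and false-negative counts."""
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom else 0.0

def macro_micro_f1(y_true, y_pred):
    """y_true / y_pred: lists of binary label vectors, one vector per example."""
    n_labels = len(y_true[0])
    per_label = []
    tot_tp = tot_fp = tot_fn = 0
    for j in range(n_labels):
        tp = sum(t[j] and p[j] for t, p in zip(y_true, y_pred))
        fp = sum((not t[j]) and p[j] for t, p in zip(y_true, y_pred))
        fn = sum(t[j] and (not p[j]) for t, p in zip(y_true, y_pred))
        per_label.append(f1(tp, fp, fn))
        tot_tp, tot_fp, tot_fn = tot_tp + tp, tot_fp + fp, tot_fn + fn
    macro = sum(per_label) / n_labels   # mean of per-label F1 scores
    micro = f1(tot_tp, tot_fp, tot_fn)  # F1 over pooled counts
    return macro, micro
```

With many zero-scoring rare labels, macro can sit well below micro, which matches the gap between the subtask scores reported above.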
|
|
|
## Model description |
|
|
|
More information needed |
|
|
|
## Intended uses & limitations |
|
|
|
More information needed |
|
|
|
## Training and evaluation data |
|
|
|
More information needed |
|
|
|
## Training procedure |
|
|
|
### Training hyperparameters |
|
|
|
The following hyperparameters were used during training: |
|
- learning_rate: 2e-05 |
|
- train_batch_size: 8 |
|
- eval_batch_size: 8 |
|
- seed: 42 |
|
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 |
|
- lr_scheduler_type: linear |
|
- lr_scheduler_warmup_ratio: 0.2 |
|
- num_epochs: 4 |
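The hyperparameters listed above correspond roughly to the following `transformers.TrainingArguments` configuration. This is a sketch reconstructed from the card, not the original training script; `output_dir` and any option not listed above are placeholder assumptions:

```python
from transformers import TrainingArguments

# Reconstruction of the hyperparameters listed in this card.
# "output_dir" is a placeholder; options not listed are left at their defaults.
training_args = TrainingArguments(
    output_dir="deberta_large_hv_no_upsampling",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.2,
    num_train_epochs=4,
    # Adam settings matching "betas=(0.9,0.999) and epsilon=1e-08" above:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```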
|
|
|
### Training results |
|
|
|
| Training Loss | Epoch | Step | Validation Loss | F1-macro subtask 1 | F1-micro subtask 1 | Roc auc macro subtask 1 | F1-macro subtask 2 | F1-micro subtask 2 | Roc auc macro subtask 2 | Self-direction: thought1 | Self-direction: action1 | Stimulation1 | Hedonism1 | Achievement1 | Power: dominance1 | Power: resources1 | Face1 | Security: personal1 | Security: societal1 | Tradition1 | Conformity: rules1 | Conformity: interpersonal1 | Humility1 | Benevolence: caring1 | Benevolence: dependability1 | Universalism: concern1 | Universalism: nature1 | Universalism: tolerance1 | Self-direction: thought attained2 | Self-direction: thought constrained2 | Self-direction: action attained2 | Self-direction: action constrained2 | Stimulation attained2 | Stimulation constrained2 | Hedonism attained2 | Hedonism constrained2 | Achievement attained2 | Achievement constrained2 | Power: dominance attained2 | Power: dominance constrained2 | Power: resources attained2 | Power: resources constrained2 | Face attained2 | Face constrained2 | Security: personal attained2 | Security: personal constrained2 | Security: societal attained2 | Security: societal constrained2 | Tradition attained2 | Tradition constrained2 | Conformity: rules attained2 | Conformity: rules constrained2 | Conformity: interpersonal attained2 | Conformity: interpersonal constrained2 | Humility attained2 | Humility constrained2 | Benevolence: caring attained2 | Benevolence: caring constrained2 | Benevolence: dependability attained2 | Benevolence: dependability constrained2 | Universalism: concern attained2 | Universalism: concern constrained2 | Universalism: nature attained2 | Universalism: nature constrained2 | Universalism: tolerance attained2 | Universalism: tolerance constrained2 | |
|
|:-------------:|:-----:|:-----:|:---------------:|:------------------:|:------------------:|:-----------------------:|:------------------:|:------------------:|:-----------------------:|:------------------------:|:-----------------------:|:------------:|:---------:|:------------:|:-----------------:|:-----------------:|:------:|:-------------------:|:-------------------:|:----------:|:------------------:|:--------------------------:|:---------:|:--------------------:|:---------------------------:|:----------------------:|:---------------------:|:------------------------:|:---------------------------------:|:------------------------------------:|:--------------------------------:|:-----------------------------------:|:---------------------:|:------------------------:|:------------------:|:---------------------:|:---------------------:|:------------------------:|:--------------------------:|:-----------------------------:|:--------------------------:|:-----------------------------:|:--------------:|:-----------------:|:----------------------------:|:-------------------------------:|:----------------------------:|:-------------------------------:|:-------------------:|:----------------------:|:---------------------------:|:------------------------------:|:-----------------------------------:|:--------------------------------------:|:------------------:|:---------------------:|:-----------------------------:|:--------------------------------:|:------------------------------------:|:---------------------------------------:|:-------------------------------:|:----------------------------------:|:------------------------------:|:---------------------------------:|:---------------------------------:|:------------------------------------:| |
|
| 0.0611 | 1.0 | 5595 | 0.0622 | 0.0594 | 0.1531 | 0.5208 | 0.0431 | 0.0490 | 0.5798 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0759 | 0.3279 | 0.1443 | 0.0 | 0.0 | 0.3920 | 0.0 | 0.0967 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0321 | 0.0600 | 0.0 | 0.0193 | 0.0092 | 0.0510 | 0.0327 | 0.0521 | 0.0235 | 0.0157 | 0.0057 | 0.1158 | 0.0812 | 0.0702 | 0.0 | 0.0795 | 0.0813 | 0.0184 | 0.0639 | 0.0206 | 0.0366 | 0.1063 | 0.1471 | 0.0339 | 0.0054 | 0.0936 | 0.0855 | 0.0141 | 0.0313 | 0.0046 | 0.0 | 0.0444 | 0.0079 | 0.0410 | 0.0157 | 0.0763 | 0.0504 | 0.0532 | 0.0229 | 0.0174 | 0.0094 | |
|
| 0.0561 | 2.0 | 11190 | 0.0563 | 0.2338 | 0.3429 | 0.5913 | 0.0471 | 0.0516 | 0.6153 | 0.0 | 0.1688 | 0.1392 | 0.0377 | 0.3829 | 0.3254 | 0.2112 | 0.2377 | 0.2786 | 0.4429 | 0.4493 | 0.4815 | 0.0 | 0.0 | 0.1292 | 0.1145 | 0.3467 | 0.5707 | 0.1259 | 0.0194 | 0.0093 | 0.0598 | 0.0261 | 0.0632 | 0.0129 | 0.0157 | 0.0053 | 0.1331 | 0.0743 | 0.0744 | 0.0529 | 0.0934 | 0.0581 | 0.0192 | 0.0457 | 0.0214 | 0.0376 | 0.1016 | 0.1709 | 0.0395 | 0.0111 | 0.0982 | 0.1113 | 0.0144 | 0.0328 | 0.0048 | 0.0014 | 0.0515 | 0.0091 | 0.0412 | 0.0181 | 0.0800 | 0.0655 | 0.0423 | 0.0462 | 0.0125 | 0.0166 | |
|
| 0.0472 | 3.0 | 16785 | 0.0593 | 0.2994 | 0.3643 | 0.6255 | 0.0480 | 0.0518 | 0.6384 | 0.0718 | 0.2131 | 0.3010 | 0.3536 | 0.3610 | 0.3566 | 0.3286 | 0.2765 | 0.3026 | 0.4493 | 0.4686 | 0.4622 | 0.0365 | 0.0 | 0.2830 | 0.2562 | 0.3463 | 0.5799 | 0.2411 | 0.0219 | 0.0103 | 0.0627 | 0.0279 | 0.0706 | 0.0123 | 0.0162 | 0.0060 | 0.1470 | 0.0636 | 0.0777 | 0.0517 | 0.0929 | 0.0580 | 0.0206 | 0.0437 | 0.0225 | 0.0392 | 0.1222 | 0.1497 | 0.0453 | 0.0122 | 0.1321 | 0.0743 | 0.0191 | 0.0270 | 0.0059 | 0.0011 | 0.0538 | 0.0116 | 0.0471 | 0.0139 | 0.0932 | 0.0533 | 0.0494 | 0.0392 | 0.0152 | 0.0138 | |
|
| 0.0264 | 4.0 | 22380 | 0.0644 | 0.3041 | 0.3684 | 0.6250 | 0.0480 | 0.0517 | 0.6360 | 0.0857 | 0.2239 | 0.2996 | 0.3247 | 0.3849 | 0.3520 | 0.3012 | 0.2611 | 0.3260 | 0.4496 | 0.4406 | 0.4809 | 0.1264 | 0.0 | 0.2444 | 0.2495 | 0.3978 | 0.5742 | 0.2562 | 0.0201 | 0.0217 | 0.0631 | 0.0277 | 0.0695 | 0.0119 | 0.0157 | 0.0070 | 0.1352 | 0.0716 | 0.0778 | 0.0551 | 0.0888 | 0.0619 | 0.0210 | 0.0502 | 0.0210 | 0.0388 | 0.1130 | 0.1590 | 0.0413 | 0.0149 | 0.1187 | 0.0761 | 0.0161 | 0.0342 | 0.0054 | 0.0012 | 0.0531 | 0.0128 | 0.0413 | 0.0165 | 0.0823 | 0.0644 | 0.0466 | 0.0408 | 0.0153 | 0.0137 | |
|
|
|
|
|
### Framework versions |
|
|
|
- Transformers 4.41.2 |
|
- Pytorch 2.3.0 |
|
- Datasets 2.19.0 |
|
- Tokenizers 0.19.1 |
|
|