---
license: mit
base_model: microsoft/deberta-v3-large
tags:
- generated_from_trainer
model-index:
- name: deberta_large_hv_no_upsampling
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# deberta_large_hv_no_upsampling
This model is a fine-tuned version of [microsoft/deberta-v3-large](https://huggingface.co./microsoft/deberta-v3-large) on an unspecified dataset.
It achieves the following results on the evaluation set (an inference sketch follows the metric list):
- Loss: 0.0566
- F1-macro subtask 1: 0.2178
- F1-micro subtask 1: 0.3264
- Roc auc macro subtask 1: 0.5833
- F1-macro subtask 2: 0.0469
- F1-micro subtask 2: 0.0515
- Roc auc macro subtask 2: 0.6196
- Self-direction: thought1: 0.0
- Self-direction: action1: 0.1049
- Stimulation1: 0.1055
- Hedonism1: 0.2167
- Achievement1: 0.3279
- Power: dominance1: 0.3324
- Power: resources1: 0.1996
- Face1: 0.1995
- Security: personal1: 0.2380
- Security: societal1: 0.4408
- Tradition1: 0.3850
- Conformity: rules1: 0.4826
- Conformity: interpersonal1: 0.0
- Humility1: 0.0
- Benevolence: caring1: 0.1362
- Benevolence: dependability1: 0.0390
- Universalism: concern1: 0.3558
- Universalism: nature1: 0.5436
- Universalism: tolerance1: 0.0310
- Self-direction: thought attained2: 0.0192
- Self-direction: thought constrained2: 0.0086
- Self-direction: action attained2: 0.0609
- Self-direction: action constrained2: 0.0225
- Stimulation attained2: 0.0618
- Stimulation constrained2: 0.0130
- Hedonism attained2: 0.0152
- Hedonism constrained2: 0.0071
- Achievement attained2: 0.1285
- Achievement constrained2: 0.0769
- Power: dominance attained2: 0.0732
- Power: dominance constrained2: 0.0505
- Power: resources attained2: 0.0837
- Power: resources constrained2: 0.0743
- Face attained2: 0.0216
- Face constrained2: 0.0448
- Security: personal attained2: 0.0173
- Security: personal constrained2: 0.0441
- Security: societal attained2: 0.1013
- Security: societal constrained2: 0.1787
- Tradition attained2: 0.0399
- Tradition constrained2: 0.0085
- Conformity: rules attained2: 0.1024
- Conformity: rules constrained2: 0.0964
- Conformity: interpersonal attained2: 0.0146
- Conformity: interpersonal constrained2: 0.0302
- Humility attained2: 0.0050
- Humility constrained2: 0.0019
- Benevolence: caring attained2: 0.0500
- Benevolence: caring constrained2: 0.0101
- Benevolence: dependability attained2: 0.0407
- Benevolence: dependability constrained2: 0.0199
- Universalism: concern attained2: 0.0816
- Universalism: concern constrained2: 0.0610
- Universalism: nature attained2: 0.0473
- Universalism: nature constrained2: 0.0393
- Universalism: tolerance attained2: 0.0118
- Universalism: tolerance constrained2: 0.0195
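The per-label scores above come from a multi-label setup, with one score per value category and subtask. Below is a minimal inference sketch, assuming the checkpoint exposes a standard multi-label sequence-classification head whose `id2label` mapping matches the categories above; the repo id and the 0.5 decision threshold are illustrative, not confirmed by this card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed repo id; replace with the actual checkpoint path if it differs.
model_id = "aishanur/deberta_large_hv_no_upsampling"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "We must protect natural habitats for future generations."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label decoding: a sigmoid per label, thresholded independently.
probs = torch.sigmoid(logits)[0]
predicted = [
    model.config.id2label[i]
    for i, p in enumerate(probs.tolist())
    if p >= 0.5  # illustrative threshold, not taken from this card
]
print(predicted)
```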
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 4
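As a rough guide only, the hyperparameters above map onto `transformers.TrainingArguments` roughly as shown below; the output directory and the evaluation/logging strategies are assumptions, not taken from this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="deberta_large_hv_no_upsampling",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=4,
    lr_scheduler_type="linear",
    warmup_ratio=0.2,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # per-epoch evaluation, consistent with the results table below
    logging_strategy="epoch",
)
```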
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1-macro subtask 1 | F1-micro subtask 1 | Roc auc macro subtask 1 | F1-macro subtask 2 | F1-micro subtask 2 | Roc auc macro subtask 2 | Self-direction: thought1 | Self-direction: action1 | Stimulation1 | Hedonism1 | Achievement1 | Power: dominance1 | Power: resources1 | Face1 | Security: personal1 | Security: societal1 | Tradition1 | Conformity: rules1 | Conformity: interpersonal1 | Humility1 | Benevolence: caring1 | Benevolence: dependability1 | Universalism: concern1 | Universalism: nature1 | Universalism: tolerance1 | Self-direction: thought attained2 | Self-direction: thought constrained2 | Self-direction: action attained2 | Self-direction: action constrained2 | Stimulation attained2 | Stimulation constrained2 | Hedonism attained2 | Hedonism constrained2 | Achievement attained2 | Achievement constrained2 | Power: dominance attained2 | Power: dominance constrained2 | Power: resources attained2 | Power: resources constrained2 | Face attained2 | Face constrained2 | Security: personal attained2 | Security: personal constrained2 | Security: societal attained2 | Security: societal constrained2 | Tradition attained2 | Tradition constrained2 | Conformity: rules attained2 | Conformity: rules constrained2 | Conformity: interpersonal attained2 | Conformity: interpersonal constrained2 | Humility attained2 | Humility constrained2 | Benevolence: caring attained2 | Benevolence: caring constrained2 | Benevolence: dependability attained2 | Benevolence: dependability constrained2 | Universalism: concern attained2 | Universalism: concern constrained2 | Universalism: nature attained2 | Universalism: nature constrained2 | Universalism: tolerance attained2 | Universalism: tolerance constrained2 |
|:-------------:|:-----:|:-----:|:---------------:|:------------------:|:------------------:|:-----------------------:|:------------------:|:------------------:|:-----------------------:|:------------------------:|:-----------------------:|:------------:|:---------:|:------------:|:-----------------:|:-----------------:|:------:|:-------------------:|:-------------------:|:----------:|:------------------:|:--------------------------:|:---------:|:--------------------:|:---------------------------:|:----------------------:|:---------------------:|:------------------------:|:---------------------------------:|:------------------------------------:|:--------------------------------:|:-----------------------------------:|:---------------------:|:------------------------:|:------------------:|:---------------------:|:---------------------:|:------------------------:|:--------------------------:|:-----------------------------:|:--------------------------:|:-----------------------------:|:--------------:|:-----------------:|:----------------------------:|:-------------------------------:|:----------------------------:|:-------------------------------:|:-------------------:|:----------------------:|:---------------------------:|:------------------------------:|:-----------------------------------:|:--------------------------------------:|:------------------:|:---------------------:|:-----------------------------:|:--------------------------------:|:------------------------------------:|:---------------------------------------:|:-------------------------------:|:----------------------------------:|:------------------------------:|:---------------------------------:|:---------------------------------:|:------------------------------------:|
| 0.0635 | 1.0 | 5595 | 0.0605 | 0.0659 | 0.1339 | 0.5221 | 0.0458 | 0.0501 | 0.5899 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0812 | 0.3342 | 0.1051 | 0.0 | 0.0072 | 0.2917 | 0.0 | 0.0919 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0030 | 0.3383 | 0.0 | 0.0190 | 0.0 | 0.0512 | 0.0 | 0.0525 | 0.0083 | 0.0146 | 0.0068 | 0.1117 | 0.1019 | 0.0704 | 0.0095 | 0.0782 | 0.0900 | 0.0185 | 0.1068 | 0.0173 | 0.0453 | 0.1012 | 0.1626 | 0.0351 | 0.0034 | 0.0967 | 0.1016 | 0.0161 | 0.0300 | 0.0044 | 0.0 | 0.0444 | 0.0173 | 0.0403 | 0.0159 | 0.0742 | 0.0693 | 0.0412 | 0.0551 | 0.0158 | 0.0131 |
| 0.0579 | 2.0 | 11190 | 0.0566 | 0.2178 | 0.3264 | 0.5833 | 0.0469 | 0.0515 | 0.6196 | 0.0 | 0.1049 | 0.1055 | 0.2167 | 0.3279 | 0.3324 | 0.1996 | 0.1995 | 0.2380 | 0.4408 | 0.3850 | 0.4826 | 0.0 | 0.0 | 0.1362 | 0.0390 | 0.3558 | 0.5436 | 0.0310 | 0.0192 | 0.0086 | 0.0609 | 0.0225 | 0.0618 | 0.0130 | 0.0152 | 0.0071 | 0.1285 | 0.0769 | 0.0732 | 0.0505 | 0.0837 | 0.0743 | 0.0216 | 0.0448 | 0.0173 | 0.0441 | 0.1013 | 0.1787 | 0.0399 | 0.0085 | 0.1024 | 0.0964 | 0.0146 | 0.0302 | 0.0050 | 0.0019 | 0.0500 | 0.0101 | 0.0407 | 0.0199 | 0.0816 | 0.0610 | 0.0473 | 0.0393 | 0.0118 | 0.0195 |
| 0.0494 | 3.0 | 16785 | 0.0593 | 0.2890 | 0.3592 | 0.6203 | 0.0477 | 0.0517 | 0.6389 | 0.0326 | 0.2200 | 0.3066 | 0.3494 | 0.3684 | 0.3375 | 0.2974 | 0.2748 | 0.3153 | 0.4421 | 0.4372 | 0.4851 | 0.0097 | 0.0 | 0.2210 | 0.2649 | 0.3284 | 0.5909 | 0.2099 | 0.0210 | 0.0147 | 0.0642 | 0.0226 | 0.0684 | 0.0111 | 0.0150 | 0.0069 | 0.1390 | 0.0675 | 0.0769 | 0.0533 | 0.0862 | 0.0681 | 0.0234 | 0.0406 | 0.0225 | 0.0387 | 0.1132 | 0.1578 | 0.0461 | 0.0105 | 0.1287 | 0.0736 | 0.0176 | 0.0290 | 0.0067 | 0.0017 | 0.0550 | 0.0100 | 0.0440 | 0.0168 | 0.0898 | 0.0538 | 0.0513 | 0.0369 | 0.0127 | 0.0168 |
| 0.028 | 4.0 | 22380 | 0.0640 | 0.2936 | 0.3626 | 0.6191 | 0.0479 | 0.0516 | 0.6370 | 0.0609 | 0.2275 | 0.3042 | 0.3444 | 0.3601 | 0.3445 | 0.3163 | 0.2599 | 0.3326 | 0.4429 | 0.4259 | 0.4933 | 0.075 | 0.0 | 0.2313 | 0.2688 | 0.3698 | 0.5570 | 0.1639 | 0.0202 | 0.0213 | 0.0640 | 0.0253 | 0.0699 | 0.0114 | 0.0151 | 0.0070 | 0.1318 | 0.0736 | 0.0776 | 0.0491 | 0.0847 | 0.0708 | 0.0234 | 0.0426 | 0.0210 | 0.0400 | 0.1080 | 0.1655 | 0.0419 | 0.0115 | 0.1199 | 0.0767 | 0.0161 | 0.0337 | 0.0069 | 0.0016 | 0.0564 | 0.0105 | 0.0407 | 0.0187 | 0.0811 | 0.0635 | 0.0459 | 0.0429 | 0.0131 | 0.0158 |
### Framework versions
- Transformers 4.41.2
- Pytorch 2.3.0
- Datasets 2.19.0
- Tokenizers 0.19.1