aishanur committed
Commit 9764aed
1 Parent(s): 8641f11

Update README.md

Files changed (1):
  1. README.md +2 -91

README.md CHANGED
@@ -11,88 +11,9 @@ model-index:
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# deberta_large_hv_no_upsampling
 
-This model is a fine-tuned version of [microsoft/deberta-v3-large](https://huggingface.co/microsoft/deberta-v3-large) on the None dataset.
-It achieves the following results on the evaluation set:
-- Loss: 0.0563
-- F1-macro subtask 1: 0.2338
-- F1-micro subtask 1: 0.3429
-- Roc auc macro subtask 1: 0.5913
-- F1-macro subtask 2: 0.0471
-- F1-micro subtask 2: 0.0516
-- Roc auc macro subtask 2: 0.6153
-- Self-direction: thought1: 0.0
-- Self-direction: action1: 0.1688
-- Stimulation1: 0.1392
-- Hedonism1: 0.0377
-- Achievement1: 0.3829
-- Power: dominance1: 0.3254
-- Power: resources1: 0.2112
-- Face1: 0.2377
-- Security: personal1: 0.2786
-- Security: societal1: 0.4429
-- Tradition1: 0.4493
-- Conformity: rules1: 0.4815
-- Conformity: interpersonal1: 0.0
-- Humility1: 0.0
-- Benevolence: caring1: 0.1292
-- Benevolence: dependability1: 0.1145
-- Universalism: concern1: 0.3467
-- Universalism: nature1: 0.5707
-- Universalism: tolerance1: 0.1259
-- Self-direction: thought attained2: 0.0194
-- Self-direction: thought constrained2: 0.0093
-- Self-direction: action attained2: 0.0598
-- Self-direction: action constrained2: 0.0261
-- Stimulation attained2: 0.0632
-- Stimulation constrained2: 0.0129
-- Hedonism attained2: 0.0157
-- Hedonism constrained2: 0.0053
-- Achievement attained2: 0.1331
-- Achievement constrained2: 0.0743
-- Power: dominance attained2: 0.0744
-- Power: dominance constrained2: 0.0529
-- Power: resources attained2: 0.0934
-- Power: resources constrained2: 0.0581
-- Face attained2: 0.0192
-- Face constrained2: 0.0457
-- Security: personal attained2: 0.0214
-- Security: personal constrained2: 0.0376
-- Security: societal attained2: 0.1016
-- Security: societal constrained2: 0.1709
-- Tradition attained2: 0.0395
-- Tradition constrained2: 0.0111
-- Conformity: rules attained2: 0.0982
-- Conformity: rules constrained2: 0.1113
-- Conformity: interpersonal attained2: 0.0144
-- Conformity: interpersonal constrained2: 0.0328
-- Humility attained2: 0.0048
-- Humility constrained2: 0.0014
-- Benevolence: caring attained2: 0.0515
-- Benevolence: caring constrained2: 0.0091
-- Benevolence: dependability attained2: 0.0412
-- Benevolence: dependability constrained2: 0.0181
-- Universalism: concern attained2: 0.0800
-- Universalism: concern constrained2: 0.0655
-- Universalism: nature attained2: 0.0423
-- Universalism: nature constrained2: 0.0462
-- Universalism: tolerance attained2: 0.0125
-- Universalism: tolerance constrained2: 0.0166
-
-## Model description
-
-More information needed
-
-## Intended uses & limitations
-
-More information needed
-
-## Training and evaluation data
-
-More information needed
-
-## Training procedure
 
 ### Training hyperparameters
 
@@ -106,16 +27,6 @@ The following hyperparameters were used during training:
 - lr_scheduler_warmup_ratio: 0.2
 - num_epochs: 4
 
-### Training results
-
-| Training Loss | Epoch | Step | Validation Loss | F1-macro subtask 1 | F1-micro subtask 1 | Roc auc macro subtask 1 | F1-macro subtask 2 | F1-micro subtask 2 | Roc auc macro subtask 2 | Self-direction: thought1 | Self-direction: action1 | Stimulation1 | Hedonism1 | Achievement1 | Power: dominance1 | Power: resources1 | Face1 | Security: personal1 | Security: societal1 | Tradition1 | Conformity: rules1 | Conformity: interpersonal1 | Humility1 | Benevolence: caring1 | Benevolence: dependability1 | Universalism: concern1 | Universalism: nature1 | Universalism: tolerance1 | Self-direction: thought attained2 | Self-direction: thought constrained2 | Self-direction: action attained2 | Self-direction: action constrained2 | Stimulation attained2 | Stimulation constrained2 | Hedonism attained2 | Hedonism constrained2 | Achievement attained2 | Achievement constrained2 | Power: dominance attained2 | Power: dominance constrained2 | Power: resources attained2 | Power: resources constrained2 | Face attained2 | Face constrained2 | Security: personal attained2 | Security: personal constrained2 | Security: societal attained2 | Security: societal constrained2 | Tradition attained2 | Tradition constrained2 | Conformity: rules attained2 | Conformity: rules constrained2 | Conformity: interpersonal attained2 | Conformity: interpersonal constrained2 | Humility attained2 | Humility constrained2 | Benevolence: caring attained2 | Benevolence: caring constrained2 | Benevolence: dependability attained2 | Benevolence: dependability constrained2 | Universalism: concern attained2 | Universalism: concern constrained2 | Universalism: nature attained2 | Universalism: nature constrained2 | Universalism: tolerance attained2 | Universalism: tolerance constrained2 |
-|:-------------:|:-----:|:-----:|:---------------:|:------------------:|:------------------:|:-----------------------:|:------------------:|:------------------:|:-----------------------:|:------------------------:|:-----------------------:|:------------:|:---------:|:------------:|:-----------------:|:-----------------:|:------:|:-------------------:|:-------------------:|:----------:|:------------------:|:--------------------------:|:---------:|:--------------------:|:---------------------------:|:----------------------:|:---------------------:|:------------------------:|:---------------------------------:|:------------------------------------:|:--------------------------------:|:-----------------------------------:|:---------------------:|:------------------------:|:------------------:|:---------------------:|:---------------------:|:------------------------:|:--------------------------:|:-----------------------------:|:--------------------------:|:-----------------------------:|:--------------:|:-----------------:|:----------------------------:|:-------------------------------:|:----------------------------:|:-------------------------------:|:-------------------:|:----------------------:|:---------------------------:|:------------------------------:|:-----------------------------------:|:--------------------------------------:|:------------------:|:---------------------:|:-----------------------------:|:--------------------------------:|:------------------------------------:|:---------------------------------------:|:-------------------------------:|:----------------------------------:|:------------------------------:|:---------------------------------:|:---------------------------------:|:------------------------------------:|
-| 0.0611 | 1.0 | 5595 | 0.0622 | 0.0594 | 0.1531 | 0.5208 | 0.0431 | 0.0490 | 0.5798 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0759 | 0.3279 | 0.1443 | 0.0 | 0.0 | 0.3920 | 0.0 | 0.0967 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0321 | 0.0600 | 0.0 | 0.0193 | 0.0092 | 0.0510 | 0.0327 | 0.0521 | 0.0235 | 0.0157 | 0.0057 | 0.1158 | 0.0812 | 0.0702 | 0.0 | 0.0795 | 0.0813 | 0.0184 | 0.0639 | 0.0206 | 0.0366 | 0.1063 | 0.1471 | 0.0339 | 0.0054 | 0.0936 | 0.0855 | 0.0141 | 0.0313 | 0.0046 | 0.0 | 0.0444 | 0.0079 | 0.0410 | 0.0157 | 0.0763 | 0.0504 | 0.0532 | 0.0229 | 0.0174 | 0.0094 |
-| 0.0561 | 2.0 | 11190 | 0.0563 | 0.2338 | 0.3429 | 0.5913 | 0.0471 | 0.0516 | 0.6153 | 0.0 | 0.1688 | 0.1392 | 0.0377 | 0.3829 | 0.3254 | 0.2112 | 0.2377 | 0.2786 | 0.4429 | 0.4493 | 0.4815 | 0.0 | 0.0 | 0.1292 | 0.1145 | 0.3467 | 0.5707 | 0.1259 | 0.0194 | 0.0093 | 0.0598 | 0.0261 | 0.0632 | 0.0129 | 0.0157 | 0.0053 | 0.1331 | 0.0743 | 0.0744 | 0.0529 | 0.0934 | 0.0581 | 0.0192 | 0.0457 | 0.0214 | 0.0376 | 0.1016 | 0.1709 | 0.0395 | 0.0111 | 0.0982 | 0.1113 | 0.0144 | 0.0328 | 0.0048 | 0.0014 | 0.0515 | 0.0091 | 0.0412 | 0.0181 | 0.0800 | 0.0655 | 0.0423 | 0.0462 | 0.0125 | 0.0166 |
-| 0.0472 | 3.0 | 16785 | 0.0593 | 0.2994 | 0.3643 | 0.6255 | 0.0480 | 0.0518 | 0.6384 | 0.0718 | 0.2131 | 0.3010 | 0.3536 | 0.3610 | 0.3566 | 0.3286 | 0.2765 | 0.3026 | 0.4493 | 0.4686 | 0.4622 | 0.0365 | 0.0 | 0.2830 | 0.2562 | 0.3463 | 0.5799 | 0.2411 | 0.0219 | 0.0103 | 0.0627 | 0.0279 | 0.0706 | 0.0123 | 0.0162 | 0.0060 | 0.1470 | 0.0636 | 0.0777 | 0.0517 | 0.0929 | 0.0580 | 0.0206 | 0.0437 | 0.0225 | 0.0392 | 0.1222 | 0.1497 | 0.0453 | 0.0122 | 0.1321 | 0.0743 | 0.0191 | 0.0270 | 0.0059 | 0.0011 | 0.0538 | 0.0116 | 0.0471 | 0.0139 | 0.0932 | 0.0533 | 0.0494 | 0.0392 | 0.0152 | 0.0138 |
-| 0.0264 | 4.0 | 22380 | 0.0644 | 0.3041 | 0.3684 | 0.6250 | 0.0480 | 0.0517 | 0.6360 | 0.0857 | 0.2239 | 0.2996 | 0.3247 | 0.3849 | 0.3520 | 0.3012 | 0.2611 | 0.3260 | 0.4496 | 0.4406 | 0.4809 | 0.1264 | 0.0 | 0.2444 | 0.2495 | 0.3978 | 0.5742 | 0.2562 | 0.0201 | 0.0217 | 0.0631 | 0.0277 | 0.0695 | 0.0119 | 0.0157 | 0.0070 | 0.1352 | 0.0716 | 0.0778 | 0.0551 | 0.0888 | 0.0619 | 0.0210 | 0.0502 | 0.0210 | 0.0388 | 0.1130 | 0.1590 | 0.0413 | 0.0149 | 0.1187 | 0.0761 | 0.0161 | 0.0342 | 0.0054 | 0.0012 | 0.0531 | 0.0128 | 0.0413 | 0.0165 | 0.0823 | 0.0644 | 0.0466 | 0.0408 | 0.0153 | 0.0137 |
-
-
 ### Framework versions
 
 - Transformers 4.41.2
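
As a side note on the hyperparameters above: with `lr_scheduler_warmup_ratio: 0.2` and the step counts visible in the removed training-results table (5595 steps at epoch 1.0, 22380 at epoch 4.0), the implied warmup length can be worked out directly. A minimal sketch, assuming the ratio applies to total training steps, as is conventional for linear warmup:

```python
# Warmup length implied by the listed hyperparameters.
# Step counts are taken from the training-results table in this card.
steps_per_epoch = 5595        # step recorded at epoch 1.0
num_epochs = 4
warmup_ratio = 0.2            # lr_scheduler_warmup_ratio

total_steps = steps_per_epoch * num_epochs
warmup_steps = int(total_steps * warmup_ratio)

print(total_steps, warmup_steps)  # 22380 4476
```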
 
11
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
12
  should probably proofread and complete it, then remove this comment. -->
13
 
14
+ # Human Value Detection Deberta Large
15
 
16
+ This model is a fine-tuned version of [microsoft/deberta-v3-large](https://huggingface.co/microsoft/deberta-v3-large) on the training data of the ValueML dataset.
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
17
 
18
  ### Training hyperparameters
19
 
 
27
  - lr_scheduler_warmup_ratio: 0.2
28
  - num_epochs: 4
29
 
 
 
 
 
 
 
 
 
 
 
30
  ### Framework versions
31
 
32
  - Transformers 4.41.2
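
For readers reproducing the evaluation: the subtask scores reported in this card (F1-macro, F1-micro, ROC AUC macro) are standard multi-label metrics. A minimal sketch of how such scores can be computed with scikit-learn; the labels and probabilities below are illustrative toy values, not outputs of this model:

```python
import numpy as np
from sklearn.metrics import f1_score, roc_auc_score

# Toy multi-label targets and predicted probabilities (3 samples, 4 value labels).
y_true = np.array([[1, 0, 1, 0],
                   [0, 1, 0, 0],
                   [1, 1, 0, 1]])
y_prob = np.array([[0.9, 0.2, 0.7, 0.1],
                   [0.3, 0.8, 0.2, 0.4],
                   [0.6, 0.7, 0.1, 0.9]])
y_pred = (y_prob >= 0.5).astype(int)  # threshold probabilities at 0.5

f1_macro = f1_score(y_true, y_pred, average="macro")       # unweighted mean over labels
f1_micro = f1_score(y_true, y_pred, average="micro")       # pooled TP/FP/FN counts
roc_auc_macro = roc_auc_score(y_true, y_prob, average="macro")

# The thresholded predictions match the targets exactly here, and every label's
# positives are ranked above its negatives, so all three scores come out as 1.0.
print(f1_macro, f1_micro, roc_auc_macro)
```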