Training in progress, step 200
- README.md +22 -23
- logs/events.out.tfevents.1719324686.Allianz-Editique.1964.0 +2 -2
- logs/events.out.tfevents.1719327221.Allianz-Editique.2290.0 +3 -0
- logs/events.out.tfevents.1719343104.Allianz-Editique.3019.0 +3 -0
- logs/events.out.tfevents.1719343147.Allianz-Editique.3155.0 +3 -0
- logs/events.out.tfevents.1719343178.Allianz-Editique.3263.0 +3 -0
- logs/events.out.tfevents.1719343262.Allianz-Editique.3404.0 +3 -0
- logs/events.out.tfevents.1719343334.Allianz-Editique.3488.0 +3 -0
- logs/events.out.tfevents.1719343371.Allianz-Editique.3545.0 +3 -0
- logs/events.out.tfevents.1719343400.Allianz-Editique.3601.0 +3 -0
- model.safetensors +1 -1
- training_args.bin +1 -1
README.md
CHANGED
@@ -15,14 +15,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.
-- Answer: {'precision': 0.
-- Header: {'precision': 0.
-- Question: {'precision': 0.
-- Overall Precision: 0.
-- Overall Recall: 0.
-- Overall F1: 0.
-- Overall Accuracy: 0.
+- Loss: 1.6323
+- Answer: {'precision': 0.8683901292596945, 'recall': 0.9045287637698899, 'f1': 0.8860911270983215, 'number': 817}
+- Header: {'precision': 0.6095238095238096, 'recall': 0.5378151260504201, 'f1': 0.5714285714285715, 'number': 119}
+- Question: {'precision': 0.90063233965673, 'recall': 0.9257195914577531, 'f1': 0.913003663003663, 'number': 1077}
+- Overall Precision: 0.8725
+- Overall Recall: 0.8942
+- Overall F1: 0.8832
+- Overall Accuracy: 0.8071
 
 ## Model description
 
@@ -45,7 +45,6 @@ The following hyperparameters were used during training:
 - train_batch_size: 8
 - eval_batch_size: 8
 - seed: 42
-- distributed_type: multi-GPU
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - training_steps: 2500
@@ -53,20 +52,20 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Answer | Header
-|
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.
-| 0.0003 | 115.7895 | 2200 | 1.
-| 0.0002 | 126.3158 | 2400 | 1.
+| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:-------------:|:--------:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
+| 0.4052 | 10.5263 | 200 | 1.0646 | {'precision': 0.8046709129511678, 'recall': 0.9277845777233782, 'f1': 0.861853325753269, 'number': 817} | {'precision': 0.5803571428571429, 'recall': 0.5462184873949579, 'f1': 0.5627705627705628, 'number': 119} | {'precision': 0.8797061524334252, 'recall': 0.8895078922934077, 'f1': 0.8845798707294553, 'number': 1077} | 0.8311 | 0.8847 | 0.8571 | 0.7850 |
+| 0.0474 | 21.0526 | 400 | 1.2300 | {'precision': 0.8500590318772137, 'recall': 0.8812729498164015, 'f1': 0.8653846153846153, 'number': 817} | {'precision': 0.5454545454545454, 'recall': 0.6554621848739496, 'f1': 0.5954198473282444, 'number': 119} | {'precision': 0.8798908098271155, 'recall': 0.8978644382544104, 'f1': 0.8887867647058824, 'number': 1077} | 0.8449 | 0.8768 | 0.8606 | 0.8026 |
+| 0.0127 | 31.5789 | 600 | 1.5767 | {'precision': 0.8359728506787331, 'recall': 0.9045287637698899, 'f1': 0.8689006466784246, 'number': 817} | {'precision': 0.5583333333333333, 'recall': 0.5630252100840336, 'f1': 0.5606694560669456, 'number': 119} | {'precision': 0.8940397350993378, 'recall': 0.8774373259052924, 'f1': 0.8856607310215557, 'number': 1077} | 0.8496 | 0.8698 | 0.8596 | 0.7835 |
+| 0.0085 | 42.1053 | 800 | 1.3875 | {'precision': 0.833710407239819, 'recall': 0.9020807833537332, 'f1': 0.8665490887713109, 'number': 817} | {'precision': 0.6363636363636364, 'recall': 0.5294117647058824, 'f1': 0.5779816513761468, 'number': 119} | {'precision': 0.8825654923215899, 'recall': 0.9071494893221913, 'f1': 0.8946886446886446, 'number': 1077} | 0.8502 | 0.8828 | 0.8662 | 0.8072 |
+| 0.0058 | 52.6316 | 1000 | 1.4794 | {'precision': 0.8272017837235228, 'recall': 0.9082007343941249, 'f1': 0.8658109684947491, 'number': 817} | {'precision': 0.5203252032520326, 'recall': 0.5378151260504201, 'f1': 0.5289256198347108, 'number': 119} | {'precision': 0.8829981718464351, 'recall': 0.8969359331476323, 'f1': 0.889912482726854, 'number': 1077} | 0.8382 | 0.8803 | 0.8587 | 0.7964 |
+| 0.0038 | 63.1579 | 1200 | 1.5286 | {'precision': 0.8443935926773455, 'recall': 0.9033047735618115, 'f1': 0.872856298048492, 'number': 817} | {'precision': 0.625, 'recall': 0.5042016806722689, 'f1': 0.5581395348837209, 'number': 119} | {'precision': 0.8991674375578168, 'recall': 0.9025069637883009, 'f1': 0.9008341056533827, 'number': 1077} | 0.8630 | 0.8793 | 0.8711 | 0.8084 |
+| 0.0023 | 73.6842 | 1400 | 1.6443 | {'precision': 0.8725146198830409, 'recall': 0.9130966952264382, 'f1': 0.8923444976076554, 'number': 817} | {'precision': 0.5, 'recall': 0.5210084033613446, 'f1': 0.5102880658436215, 'number': 119} | {'precision': 0.8906955736224029, 'recall': 0.9155060352831941, 'f1': 0.9029304029304029, 'number': 1077} | 0.8600 | 0.8912 | 0.8753 | 0.8054 |
+| 0.0012 | 84.2105 | 1600 | 1.6379 | {'precision': 0.8404977375565611, 'recall': 0.9094247246022031, 'f1': 0.8736037624926513, 'number': 817} | {'precision': 0.6224489795918368, 'recall': 0.5126050420168067, 'f1': 0.5622119815668203, 'number': 119} | {'precision': 0.8944444444444445, 'recall': 0.8969359331476323, 'f1': 0.8956884561891516, 'number': 1077} | 0.8584 | 0.8793 | 0.8687 | 0.8008 |
+| 0.0005 | 94.7368 | 1800 | 1.6798 | {'precision': 0.8450057405281286, 'recall': 0.9008567931456548, 'f1': 0.8720379146919431, 'number': 817} | {'precision': 0.6534653465346535, 'recall': 0.5546218487394958, 'f1': 0.6000000000000001, 'number': 119} | {'precision': 0.8886861313868614, 'recall': 0.904363974001857, 'f1': 0.8964565117349288, 'number': 1077} | 0.8588 | 0.8823 | 0.8704 | 0.7988 |
+| 0.0004 | 105.2632 | 2000 | 1.6804 | {'precision': 0.8596491228070176, 'recall': 0.8996328029375765, 'f1': 0.8791866028708135, 'number': 817} | {'precision': 0.5203252032520326, 'recall': 0.5378151260504201, 'f1': 0.5289256198347108, 'number': 119} | {'precision': 0.8869801084990958, 'recall': 0.9108635097493036, 'f1': 0.8987631699496106, 'number': 1077} | 0.8541 | 0.8843 | 0.8689 | 0.8032 |
+| 0.0003 | 115.7895 | 2200 | 1.6352 | {'precision': 0.8713105076741441, 'recall': 0.9033047735618115, 'f1': 0.8870192307692307, 'number': 817} | {'precision': 0.6153846153846154, 'recall': 0.5378151260504201, 'f1': 0.5739910313901345, 'number': 119} | {'precision': 0.8938848920863309, 'recall': 0.9229340761374187, 'f1': 0.9081772498857925, 'number': 1077} | 0.8706 | 0.8922 | 0.8813 | 0.8066 |
+| 0.0002 | 126.3158 | 2400 | 1.6323 | {'precision': 0.8683901292596945, 'recall': 0.9045287637698899, 'f1': 0.8860911270983215, 'number': 817} | {'precision': 0.6095238095238096, 'recall': 0.5378151260504201, 'f1': 0.5714285714285715, 'number': 119} | {'precision': 0.90063233965673, 'recall': 0.9257195914577531, 'f1': 0.913003663003663, 'number': 1077} | 0.8725 | 0.8942 | 0.8832 | 0.8071 |
 
 
 ### Framework versions
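The hyperparameters hunk above maps almost one-to-one onto `transformers.TrainingArguments`. The following is a minimal sketch under that assumption, not the repository's actual training script: the output directory and the 200-step evaluation cadence are placeholders inferred from the file layout and the results table, and the learning rate is not visible in this diff.

```python
# Hedged sketch: TrainingArguments mirroring the hyperparameters listed in the
# README diff above. Anything not shown in the diff is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="lilt-finetuned",      # placeholder; not shown in the diff
    per_device_train_batch_size=8,    # train_batch_size: 8
    per_device_eval_batch_size=8,     # eval_batch_size: 8
    seed=42,                          # seed: 42
    max_steps=2500,                   # training_steps: 2500
    lr_scheduler_type="linear",       # lr_scheduler_type: linear
    adam_beta1=0.9,                   # optimizer: Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,                # and epsilon=1e-08
    evaluation_strategy="steps",      # assumed: the table reports an eval every 200 steps
    eval_steps=200,
    logging_dir="logs",               # matches the logs/ directory updated in this commit
)
```

The hunk also drops `- distributed_type: multi-GPU`, which suggests this run was launched on a single device rather than through a multi-GPU setup.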
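The per-entity dictionaries (Answer, Header, Question with `precision`/`recall`/`f1`/`number`) and the overall precision, recall, F1, and accuracy reported above have the shape produced by the `seqeval` metric. As an illustration only (the evaluation code is not part of this commit), a `compute_metrics` function along these lines would produce exactly that structure; the FUNSD-style label list is an assumption.

```python
# Hedged sketch of entity-level evaluation with seqeval; the label list and the
# wiring into Trainer are assumptions, not taken from this repository.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-HEADER", "I-HEADER", "B-QUESTION", "I-QUESTION", "B-ANSWER", "I-ANSWER"]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Skip special/padding tokens, which carry the label -100
    true_predictions = [
        [label_list[p] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred_row, label_row) if l != -100]
        for pred_row, label_row in zip(predictions, labels)
    ]
    # Returns one dict per entity type (ANSWER, HEADER, QUESTION) with
    # precision/recall/f1/number, plus overall_precision, overall_recall,
    # overall_f1, and overall_accuracy -- the same fields shown in the table above.
    return seqeval.compute(predictions=true_predictions, references=true_labels)
```

The overall scores are micro-averages over all predicted spans, which is why they track the larger Question (1077 spans) and Answer (817 spans) classes much more closely than the small Header class (119 spans).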
logs/events.out.tfevents.1719324686.Allianz-Editique.1964.0
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:a85238cc1c862c122110f139d2afb55feaa39b8d8fb8ad8f70f2e69e731c1287
+size 14182
logs/events.out.tfevents.1719327221.Allianz-Editique.2290.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:8c9c6bc86fc596f13c14eab264f16f71f21c23a60689532eb09247a791265ba6
+size 5248
logs/events.out.tfevents.1719343104.Allianz-Editique.3019.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ea0ac4e9a13855e426d03bb133ed117eefe606a0cb3c255ab028085f4aee918a
+size 5249
logs/events.out.tfevents.1719343147.Allianz-Editique.3155.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b6de03e5a3f798cb5be15e8f373fe38a9b13c6551c92103500789883ff4b1610
+size 5249
logs/events.out.tfevents.1719343178.Allianz-Editique.3263.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b4c0fae22fbcf76bba24fd2eb209645717eaf9fbfdbabd6d97cf63906193a304
+size 5248
logs/events.out.tfevents.1719343262.Allianz-Editique.3404.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:0cb430da5d5cab079849760539a7009dede5bf3b68b430af6ca1a054fc73857b
+size 5248
logs/events.out.tfevents.1719343334.Allianz-Editique.3488.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7d9aa91b3d17fa46ffe55fd760660469f5867f878ca81d1ea9f138ab5d3921bb
+size 5248
logs/events.out.tfevents.1719343371.Allianz-Editique.3545.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:b17d3cb536729e4aebbfa2f6c7848b4fb9d23c39ca2a6c0e1fc6fe14e33977b5
+size 5249
logs/events.out.tfevents.1719343400.Allianz-Editique.3601.0
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5d92473bae6545a672213c06612a7846830d7b1d329c0d1b138939c02cb79825
+size 5963
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:81e020b99027afbee6d124338076c5bc9b72d2d68ea799f224dd6989bd56ec08
 size 520727564
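The updated `model.safetensors` pointer is the step-200 weight snapshot named in the commit title. As a hedged usage sketch only, assuming the repository already contains the config and tokenizer files from earlier commits (the Hub repo id below is a placeholder):

```python
# Hedged sketch: load this intermediate LiLT checkpoint for token classification.
# "user/this-repo" is a placeholder for the actual Hub repository id.
from transformers import AutoModelForTokenClassification

model = AutoModelForTokenClassification.from_pretrained("user/this-repo")
model.eval()
# Note: LiLT token classification also expects bounding-box inputs (`bbox`)
# alongside the token ids at inference time.
```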
training_args.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:6369f95a7a9609c09fd1325036b01bed5342c7c17a44f6cc72c434aa2c766364
 size 5176