---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: Greg-Sentiment-classifier
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Greg-Sentiment-classifier

This model is a fine-tuned version of [microsoft/MiniLM-L12-H384-uncased](https://huggingface.co./microsoft/MiniLM-L12-H384-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7903
- F1: 0.7123
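
A minimal inference sketch is shown below. The checkpoint path is a placeholder for wherever this model is stored or hosted, and the label names depend on the (undocumented) training data, so both are assumptions rather than documented behavior.

```python
# Minimal inference sketch. The model path below is a placeholder; the label
# set returned by the classifier depends on the undocumented training data.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="path/to/Greg-Sentiment-classifier",  # placeholder path
)

print(classifier("The service was quick and the staff were friendly."))
# Example output shape: [{'label': 'LABEL_1', 'score': 0.87}]
```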

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
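
Expressed as a `TrainingArguments` sketch, the list above might look as follows. The 16-step evaluation cadence is inferred from the results table below and is an assumption; the Adam settings map onto the Trainer's default AdamW configuration.

```python
# Sketch of the reported hyperparameters as transformers TrainingArguments.
# evaluation_strategy/eval_steps are assumptions inferred from the results table.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Greg-Sentiment-classifier",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    evaluation_strategy="steps",  # assumed from the 16-step evaluation cadence
    eval_steps=16,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```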

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.6542        | 0.03  | 16   | 0.8127          | 0.6943 |
| 0.574         | 0.06  | 32   | 0.7935          | 0.6969 |
| 0.7078        | 0.1   | 48   | 0.8009          | 0.6957 |
| 0.5408        | 0.13  | 64   | 0.7863          | 0.7014 |
| 0.5343        | 0.16  | 80   | 0.8077          | 0.6996 |
| 0.5768        | 0.19  | 96   | 0.8239          | 0.6900 |
| 0.6633        | 0.22  | 112  | 0.7942          | 0.7028 |
| 0.5409        | 0.26  | 128  | 0.8025          | 0.6960 |
| 0.7043        | 0.29  | 144  | 0.7954          | 0.6813 |
| 0.5358        | 0.32  | 160  | 0.8058          | 0.6988 |
| 0.5558        | 0.35  | 176  | 0.8476          | 0.6799 |
| 0.5759        | 0.38  | 192  | 0.8232          | 0.7017 |
| 0.596         | 0.42  | 208  | 0.8240          | 0.7072 |
| 0.5977        | 0.45  | 224  | 0.8683          | 0.6947 |
| 0.5655        | 0.48  | 240  | 0.8378          | 0.7053 |
| 0.6274        | 0.51  | 256  | 0.8175          | 0.6980 |
| 0.4952        | 0.55  | 272  | 0.8203          | 0.6957 |
| 0.6501        | 0.58  | 288  | 0.8236          | 0.7014 |
| 0.5365        | 0.61  | 304  | 0.8082          | 0.7059 |
| 0.5598        | 0.64  | 320  | 0.8052          | 0.7121 |
| 0.5692        | 0.67  | 336  | 0.7989          | 0.7075 |
| 0.526         | 0.71  | 352  | 0.8030          | 0.6961 |
| 0.5505        | 0.74  | 368  | 0.8157          | 0.7137 |
| 0.4759        | 0.77  | 384  | 0.8466          | 0.6937 |
| 0.6622        | 0.8   | 400  | 0.8518          | 0.6982 |
| 0.6298        | 0.83  | 416  | 0.8272          | 0.6976 |
| 0.6311        | 0.87  | 432  | 0.8445          | 0.6793 |
| 0.5678        | 0.9   | 448  | 0.8096          | 0.6897 |
| 0.6687        | 0.93  | 464  | 0.7948          | 0.6968 |
| 0.6654        | 0.96  | 480  | 0.8047          | 0.7076 |
| 0.6572        | 0.99  | 496  | 0.7944          | 0.7037 |
| 0.5845        | 1.03  | 512  | 0.7772          | 0.7030 |
| 0.6611        | 1.06  | 528  | 0.7829          | 0.7005 |
| 0.4988        | 1.09  | 544  | 0.7953          | 0.7070 |
| 0.6355        | 1.12  | 560  | 0.8252          | 0.6983 |
| 0.5464        | 1.15  | 576  | 0.8293          | 0.7044 |
| 0.6188        | 1.19  | 592  | 0.8077          | 0.7073 |
| 0.5125        | 1.22  | 608  | 0.7975          | 0.7041 |
| 0.6221        | 1.25  | 624  | 0.7947          | 0.7041 |
| 0.5806        | 1.28  | 640  | 0.8027          | 0.6983 |
| 0.6335        | 1.31  | 656  | 0.7992          | 0.7027 |
| 0.6283        | 1.35  | 672  | 0.7836          | 0.7055 |
| 0.6485        | 1.38  | 688  | 0.7891          | 0.7104 |
| 0.5596        | 1.41  | 704  | 0.8146          | 0.7015 |
| 0.4928        | 1.44  | 720  | 0.7998          | 0.7088 |
| 0.5809        | 1.47  | 736  | 0.7850          | 0.7056 |
| 0.5117        | 1.51  | 752  | 0.7994          | 0.7053 |
| 0.6012        | 1.54  | 768  | 0.7960          | 0.7081 |
| 0.5213        | 1.57  | 784  | 0.8109          | 0.7034 |
| 0.6018        | 1.6   | 800  | 0.7927          | 0.7134 |
| 0.5851        | 1.64  | 816  | 0.7978          | 0.7108 |
| 0.6571        | 1.67  | 832  | 0.8131          | 0.7004 |
| 0.5215        | 1.7   | 848  | 0.7942          | 0.7146 |
| 0.5372        | 1.73  | 864  | 0.7957          | 0.7110 |
| 0.5511        | 1.76  | 880  | 0.7915          | 0.7138 |
| 0.5991        | 1.8   | 896  | 0.7899          | 0.7121 |
| 0.6128        | 1.83  | 912  | 0.7879          | 0.7136 |
| 0.5493        | 1.86  | 928  | 0.7960          | 0.7099 |
| 0.6304        | 1.89  | 944  | 0.7924          | 0.7102 |
| 0.4456        | 1.92  | 960  | 0.7904          | 0.7126 |
| 0.5484        | 1.96  | 976  | 0.7906          | 0.7127 |
| 0.515         | 1.99  | 992  | 0.7903          | 0.7123 |

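The F1 column above was presumably produced by a `compute_metrics` callback along these lines; the averaging mode is not recorded in the card, so `weighted` here is only an assumption.

```python
# Hypothetical compute_metrics used to produce the F1 column above.
# The "weighted" averaging is an assumption; the card does not state how F1 was averaged.
import numpy as np
from sklearn.metrics import f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {"f1": f1_score(labels, predictions, average="weighted")}
```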

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3