---
license: apache-2.0
base_model: google-bert/bert-large-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: Intent-classification-BERT-Large-Ashuv3
  results: []
---

# Intent-classification-BERT-Large-Ashuv3

This model is a fine-tuned version of [google-bert/bert-large-uncased](https://huggingface.co./google-bert/bert-large-uncased) on an unspecified intent-classification dataset.
It achieves the following results on the evaluation set (a sketch of how metrics like these are typically computed follows the list):
- Loss: 0.2610
- Accuracy: 0.8951
- F1: 0.8807
- Precision: 0.8812
- Recall: 0.8820
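
The averaging scheme behind the F1, precision, and recall figures is not documented. In cards generated from the 🤗 `Trainer`, these numbers usually come from a `compute_metrics` callback along the following lines; this is a minimal sketch, and the weighted averaging is an assumption:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Turn raw logits into predictions and compute the four reported metrics."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted"  # averaging method is an assumption
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```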

## Model description

This is a sequence-classification model for intent detection, obtained by fine-tuning the 24-layer, ~340M-parameter [google-bert/bert-large-uncased](https://huggingface.co./google-bert/bert-large-uncased) checkpoint, presumably with the standard `BertForSequenceClassification` head over the pooled `[CLS]` representation. The intent label set and the number of classes are not documented here; inspect the checkpoint's `id2label` config to recover them.

## Intended uses & limitations

The model is intended for classifying short user utterances into intent categories. Because the training data and label taxonomy are undocumented, check the checkpoint's `id2label` mapping before relying on its outputs, and expect degraded performance on out-of-domain text. A minimal usage sketch follows.
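
A minimal inference sketch using the 🤗 Transformers `pipeline` API; the repo id and the example utterance below are illustrative assumptions:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint. The bare repo id below is hypothetical:
# use the full "namespace/model" Hub path or a local directory in practice.
classifier = pipeline(
    "text-classification",
    model="Intent-classification-BERT-Large-Ashuv3",
)

# Classify a single utterance; returns the top intent label and its score.
result = classifier("I want to cancel my subscription")
print(result)  # e.g. [{'label': '...', 'score': 0.98}]
```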

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
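
A minimal sketch, assuming the standard 🤗 `Trainer` setup typical of `generated_from_trainer` cards. The output path and logging cadence are assumptions; `eval_steps=10` matches the cadence of the results table below:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Intent-classification-BERT-Large-Ashuv3",  # assumed output path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,                # Adam betas as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    evaluation_strategy="steps",   # the results table logs validation every 10 steps
    eval_steps=10,
    logging_steps=10,
)
```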

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 1.6762        | 0.24  | 10   | 1.3120          | 0.5280   | 0.4993 | 0.6178    | 0.5370 |
| 0.9717        | 0.49  | 20   | 0.7487          | 0.8571   | 0.8402 | 0.8670    | 0.8455 |
| 0.6375        | 0.73  | 30   | 0.4393          | 0.8509   | 0.8479 | 0.8862    | 0.8548 |
| 0.4006        | 0.98  | 40   | 0.2427          | 0.9068   | 0.9005 | 0.9228    | 0.9075 |
| 0.2291        | 1.22  | 50   | 0.1875          | 0.9068   | 0.8940 | 0.9106    | 0.8902 |
| 0.2634        | 1.46  | 60   | 0.2204          | 0.9068   | 0.8977 | 0.9135    | 0.9051 |
| 0.1916        | 1.71  | 70   | 0.1730          | 0.9130   | 0.9053 | 0.9232    | 0.9123 |
| 0.1881        | 1.95  | 80   | 0.1676          | 0.9130   | 0.9051 | 0.9232    | 0.9133 |
| 0.2692        | 2.2   | 90   | 0.1728          | 0.9068   | 0.8958 | 0.9423    | 0.8790 |
| 0.1425        | 2.44  | 100  | 0.1757          | 0.9068   | 0.8958 | 0.9423    | 0.8790 |
| 0.2674        | 2.68  | 110  | 0.3307          | 0.8758   | 0.8608 | 0.8756    | 0.8713 |
| 0.2385        | 2.93  | 120  | 0.1878          | 0.9006   | 0.8901 | 0.9059    | 0.8988 |
| 0.1868        | 3.17  | 130  | 0.1679          | 0.9130   | 0.9027 | 0.9147    | 0.9097 |
| 0.2281        | 3.41  | 140  | 0.1796          | 0.9130   | 0.9057 | 0.9274    | 0.9133 |
| 0.1459        | 3.66  | 150  | 0.1982          | 0.9068   | 0.8960 | 0.9077    | 0.9049 |
| 0.161         | 3.9   | 160  | 0.2266          | 0.8944   | 0.8772 | 0.9012    | 0.8765 |
| 0.1441        | 4.15  | 170  | 0.2062          | 0.8944   | 0.8889 | 0.9115    | 0.8935 |
| 0.172         | 4.39  | 180  | 0.2208          | 0.9006   | 0.8922 | 0.9216    | 0.8988 |
| 0.1365        | 4.63  | 190  | 0.2088          | 0.9068   | 0.8974 | 0.9244    | 0.9045 |
| 0.1795        | 4.88  | 200  | 0.2011          | 0.8820   | 0.8682 | 0.8936    | 0.8569 |
| 0.204         | 5.12  | 210  | 0.2377          | 0.8820   | 0.8642 | 0.8656    | 0.8721 |
| 0.1409        | 5.37  | 220  | 0.2178          | 0.8944   | 0.8852 | 0.9003    | 0.8776 |
| 0.1771        | 5.61  | 230  | 0.2284          | 0.8758   | 0.8624 | 0.8871    | 0.8511 |
| 0.1926        | 5.85  | 240  | 0.2211          | 0.8944   | 0.8815 | 0.8990    | 0.8761 |
| 0.2142        | 6.1   | 250  | 0.2217          | 0.9193   | 0.9082 | 0.9306    | 0.9130 |
| 0.1125        | 6.34  | 260  | 0.2321          | 0.9006   | 0.8889 | 0.9420    | 0.8702 |
| 0.1473        | 6.59  | 270  | 0.2129          | 0.9130   | 0.9057 | 0.9274    | 0.9133 |
| 0.1468        | 6.83  | 280  | 0.2318          | 0.9130   | 0.9057 | 0.9274    | 0.9133 |
| 0.1951        | 7.07  | 290  | 0.1957          | 0.9006   | 0.8879 | 0.9061    | 0.8788 |
| 0.1659        | 7.32  | 300  | 0.1961          | 0.9006   | 0.8872 | 0.9143    | 0.8752 |
| 0.1265        | 7.56  | 310  | 0.2058          | 0.9130   | 0.9049 | 0.9226    | 0.9097 |
| 0.1774        | 7.8   | 320  | 0.2223          | 0.9068   | 0.8974 | 0.9244    | 0.9045 |
| 0.2609        | 8.05  | 330  | 0.2218          | 0.8944   | 0.8833 | 0.8906    | 0.8811 |
| 0.1079        | 8.29  | 340  | 0.3312          | 0.8820   | 0.8675 | 0.8672    | 0.8680 |
| 0.1729        | 8.54  | 350  | 0.3627          | 0.8696   | 0.8500 | 0.8540    | 0.8554 |
| 0.2337        | 8.78  | 360  | 0.2526          | 0.9006   | 0.8872 | 0.9143    | 0.8752 |
| 0.1573        | 9.02  | 370  | 0.2072          | 0.9130   | 0.9049 | 0.9226    | 0.9097 |
| 0.1843        | 9.27  | 380  | 0.2605          | 0.9068   | 0.8991 | 0.9210    | 0.9085 |
| 0.1521        | 9.51  | 390  | 0.2695          | 0.9006   | 0.8920 | 0.9081    | 0.8966 |
| 0.193         | 9.76  | 400  | 0.3340          | 0.9130   | 0.9039 | 0.9187    | 0.9061 |
| 0.1034        | 10.0  | 410  | 0.3391          | 0.9068   | 0.8948 | 0.9025    | 0.9049 |
| 0.1348        | 10.24 | 420  | 0.3377          | 0.9006   | 0.8902 | 0.8998    | 0.8930 |
| 0.0856        | 10.49 | 430  | 0.3274          | 0.8882   | 0.8768 | 0.8920    | 0.8692 |
| 0.1877        | 10.73 | 440  | 0.3401          | 0.8696   | 0.8498 | 0.8504    | 0.8514 |
| 0.1775        | 10.98 | 450  | 0.4162          | 0.8882   | 0.8708 | 0.8716    | 0.8799 |
| 0.1357        | 11.22 | 460  | 0.3992          | 0.8820   | 0.8652 | 0.8622    | 0.8716 |
| 0.0878        | 11.46 | 470  | 0.3920          | 0.8944   | 0.8803 | 0.8772    | 0.8883 |
| 0.1892        | 11.71 | 480  | 0.3148          | 0.8696   | 0.8499 | 0.8472    | 0.8549 |
| 0.1712        | 11.95 | 490  | 0.3028          | 0.8758   | 0.8589 | 0.8585    | 0.8597 |
| 0.0914        | 12.2  | 500  | 0.3450          | 0.8820   | 0.8688 | 0.8705    | 0.8680 |
| 0.1793        | 12.44 | 510  | 0.3617          | 0.8882   | 0.8758 | 0.8872    | 0.8692 |
| 0.1355        | 12.68 | 520  | 0.4130          | 0.8820   | 0.8688 | 0.8705    | 0.8680 |
| 0.1518        | 12.93 | 530  | 0.5015          | 0.8944   | 0.8798 | 0.8808    | 0.8878 |
| 0.1778        | 13.17 | 540  | 0.3596          | 0.8882   | 0.8716 | 0.8709    | 0.8804 |
| 0.1662        | 13.41 | 550  | 0.3716          | 0.9006   | 0.8864 | 0.8868    | 0.8930 |
| 0.1105        | 13.66 | 560  | 0.3452          | 0.9006   | 0.8874 | 0.8903    | 0.8966 |
| 0.1369        | 13.9  | 570  | 0.3606          | 0.8944   | 0.8807 | 0.8824    | 0.8883 |
| 0.2051        | 14.15 | 580  | 0.3497          | 0.8882   | 0.8750 | 0.8784    | 0.8728 |
| 0.1441        | 14.39 | 590  | 0.4031          | 0.8820   | 0.8664 | 0.8649    | 0.8680 |
| 0.1586        | 14.63 | 600  | 0.3853          | 0.8820   | 0.8664 | 0.8649    | 0.8680 |
| 0.0974        | 14.88 | 610  | 0.4037          | 0.8820   | 0.8664 | 0.8649    | 0.8680 |
| 0.0799        | 15.12 | 620  | 0.5252          | 0.8820   | 0.8688 | 0.8705    | 0.8680 |
| 0.0969        | 15.37 | 630  | 0.5702          | 0.8820   | 0.8691 | 0.8699    | 0.8716 |
| 0.1664        | 15.61 | 640  | 0.5281          | 0.8820   | 0.8688 | 0.8705    | 0.8680 |
| 0.175         | 15.85 | 650  | 0.4865          | 0.8820   | 0.8688 | 0.8705    | 0.8680 |
| 0.1904        | 16.1  | 660  | 0.3893          | 0.8696   | 0.8528 | 0.8520    | 0.8549 |
| 0.1054        | 16.34 | 670  | 0.4320          | 0.8758   | 0.8612 | 0.8636    | 0.8597 |
| 0.1657        | 16.59 | 680  | 0.5669          | 0.8820   | 0.8688 | 0.8705    | 0.8680 |
| 0.1089        | 16.83 | 690  | 0.5642          | 0.8820   | 0.8677 | 0.8649    | 0.8716 |
| 0.0831        | 17.07 | 700  | 0.4782          | 0.8820   | 0.8709 | 0.8744    | 0.8716 |
| 0.1518        | 17.32 | 710  | 0.5122          | 0.8820   | 0.8695 | 0.8720    | 0.8680 |
| 0.1203        | 17.56 | 720  | 0.5720          | 0.8820   | 0.8695 | 0.8720    | 0.8680 |
| 0.1185        | 17.8  | 730  | 0.5798          | 0.8820   | 0.8698 | 0.8703    | 0.8716 |
| 0.1065        | 18.05 | 740  | 0.5495          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.13          | 18.29 | 750  | 0.6271          | 0.8820   | 0.8687 | 0.8696    | 0.8716 |
| 0.1382        | 18.54 | 760  | 0.6307          | 0.8758   | 0.8585 | 0.8556    | 0.8633 |
| 0.0979        | 18.78 | 770  | 0.6167          | 0.8758   | 0.8585 | 0.8556    | 0.8633 |
| 0.1328        | 19.02 | 780  | 0.6011          | 0.8758   | 0.8585 | 0.8556    | 0.8633 |
| 0.1561        | 19.27 | 790  | 0.5938          | 0.8696   | 0.8517 | 0.8495    | 0.8549 |
| 0.1638        | 19.51 | 800  | 0.6397          | 0.8696   | 0.8528 | 0.8520    | 0.8549 |
| 0.1358        | 19.76 | 810  | 0.6917          | 0.8758   | 0.8614 | 0.8649    | 0.8597 |
| 0.1298        | 20.0  | 820  | 0.6769          | 0.8696   | 0.8528 | 0.8489    | 0.8585 |
| 0.1102        | 20.24 | 830  | 0.6891          | 0.8758   | 0.8610 | 0.8594    | 0.8669 |
| 0.127         | 20.49 | 840  | 0.6950          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.1719        | 20.73 | 850  | 0.6719          | 0.8882   | 0.8754 | 0.8773    | 0.8799 |
| 0.1503        | 20.98 | 860  | 0.6462          | 0.8820   | 0.8675 | 0.8666    | 0.8716 |
| 0.1118        | 21.22 | 870  | 0.6405          | 0.8820   | 0.8690 | 0.8705    | 0.8680 |
| 0.0991        | 21.46 | 880  | 0.6492          | 0.8758   | 0.8614 | 0.8600    | 0.8633 |
| 0.1288        | 21.71 | 890  | 0.7045          | 0.8820   | 0.8688 | 0.8705    | 0.8680 |
| 0.1414        | 21.95 | 900  | 0.7439          | 0.8820   | 0.8688 | 0.8705    | 0.8680 |
| 0.1744        | 22.2  | 910  | 0.7353          | 0.8820   | 0.8688 | 0.8705    | 0.8680 |
| 0.1072        | 22.44 | 920  | 0.7524          | 0.8820   | 0.8688 | 0.8705    | 0.8680 |
| 0.0931        | 22.68 | 930  | 0.7671          | 0.8758   | 0.8614 | 0.8649    | 0.8597 |
| 0.0775        | 22.93 | 940  | 0.7442          | 0.8758   | 0.8614 | 0.8649    | 0.8597 |
| 0.0713        | 23.17 | 950  | 0.7456          | 0.8758   | 0.8614 | 0.8649    | 0.8597 |
| 0.1027        | 23.41 | 960  | 0.7528          | 0.8820   | 0.8664 | 0.8649    | 0.8680 |
| 0.1163        | 23.66 | 970  | 0.7503          | 0.8820   | 0.8664 | 0.8649    | 0.8680 |
| 0.1067        | 23.9  | 980  | 0.7359          | 0.8758   | 0.8622 | 0.8660    | 0.8597 |
| 0.0955        | 24.15 | 990  | 0.7457          | 0.8820   | 0.8676 | 0.8687    | 0.8680 |
| 0.0874        | 24.39 | 1000 | 0.7663          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.0865        | 24.63 | 1010 | 0.7761          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.1378        | 24.88 | 1020 | 0.7761          | 0.8820   | 0.8691 | 0.8699    | 0.8716 |
| 0.1411        | 25.12 | 1030 | 0.7714          | 0.8820   | 0.8676 | 0.8687    | 0.8680 |
| 0.1034        | 25.37 | 1040 | 0.7662          | 0.8820   | 0.8685 | 0.8700    | 0.8680 |
| 0.0709        | 25.61 | 1050 | 0.7720          | 0.8820   | 0.8670 | 0.8681    | 0.8680 |
| 0.1286        | 25.85 | 1060 | 0.7809          | 0.8820   | 0.8670 | 0.8681    | 0.8680 |
| 0.1191        | 26.1  | 1070 | 0.7861          | 0.8820   | 0.8676 | 0.8687    | 0.8680 |
| 0.0902        | 26.34 | 1080 | 0.7888          | 0.8820   | 0.8691 | 0.8699    | 0.8716 |
| 0.1054        | 26.59 | 1090 | 0.7894          | 0.8820   | 0.8698 | 0.8703    | 0.8716 |
| 0.1142        | 26.83 | 1100 | 0.7914          | 0.8820   | 0.8691 | 0.8699    | 0.8716 |
| 0.1175        | 27.07 | 1110 | 0.7923          | 0.8820   | 0.8691 | 0.8699    | 0.8716 |
| 0.1319        | 27.32 | 1120 | 0.7938          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.1181        | 27.56 | 1130 | 0.7967          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.0858        | 27.8  | 1140 | 0.8003          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.0697        | 28.05 | 1150 | 0.8025          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.0644        | 28.29 | 1160 | 0.8050          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.1123        | 28.54 | 1170 | 0.8063          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.0998        | 28.78 | 1180 | 0.8078          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.1297        | 29.02 | 1190 | 0.8095          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.1133        | 29.27 | 1200 | 0.8094          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.1122        | 29.51 | 1210 | 0.8095          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.1115        | 29.76 | 1220 | 0.8096          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |
| 0.0692        | 30.0  | 1230 | 0.8095          | 0.8820   | 0.8685 | 0.8701    | 0.8716 |

Note that the headline evaluation results above (loss 0.2610, accuracy 0.8951) do not match the final epoch-30 row; they presumably come from a separately selected checkpoint or evaluation run. Validation loss reaches its minimum early (≈0.17 around epochs 1–2) and climbs steadily thereafter while accuracy plateaus near 0.88, a typical overfitting pattern over a 30-epoch run; early stopping or an early checkpoint would likely serve deployment better.


### Framework versions

- Transformers 4.38.2
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.2