# hosting-lexical-10k
This model is a fine-tuned version of [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2278
- Accuracy: 0.9332
- F1: 0.9278

## Model description
More information needed

## Intended uses & limitations
More information needed
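
Since the card reports Accuracy and F1 but the pipeline type is not stated, the checkpoint is presumably a sequence-classification fine-tune of roberta-base. A minimal inference sketch follows; the repo id `nompahm/hosting-lexical-10k`, the classification task, and the label meanings are assumptions, not facts stated in this card.

```python
# Minimal inference sketch (assumptions: sequence-classification head,
# repo id nompahm/hosting-lexical-10k, labels unspecified in the card).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "nompahm/hosting-lexical-10k"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "Example input sentence."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label.get(predicted_id, predicted_id))
```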

## Training and evaluation data
More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):
- learning_rate: 0.0005
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 30
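
These settings map onto a `Trainer` configuration roughly as sketched below. Only the hyperparameter values come from this card; the dataset, preprocessing, and `num_labels` are placeholders, since the training data is not documented.

```python
# Sketch of a Trainer setup matching the listed hyperparameters.
# Dataset loading, column names, and num_labels are assumptions.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=2  # num_labels assumed
)

args = TrainingArguments(
    output_dir="hosting-lexical-10k",
    learning_rate=5e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    eval_strategy="epoch",
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   processing_class=tokenizer)
# trainer.train()
```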

### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
---|---|---|---|---|---|
13.1909 | 1.0 | 613 | 0.3549 | 0.9171 | 0.9177 |
4.0777 | 2.0 | 1226 | 0.2093 | 0.9394 | 0.9310 |
2.6943 | 3.0 | 1839 | 0.2833 | 0.9140 | 0.9158 |
2.2544 | 4.0 | 2452 | 0.2122 | 0.9302 | 0.9244 |
1.488 | 5.0 | 3065 | 0.2241 | 0.9340 | 0.9276 |
0.8386 | 6.0 | 3678 | 0.2202 | 0.9332 | 0.9271 |
1.4058 | 7.0 | 4291 | 0.2439 | 0.9233 | 0.9238 |
1.1671 | 8.0 | 4904 | 0.2386 | 0.9294 | 0.9253 |
1.2749 | 9.0 | 5517 | 0.2307 | 0.9355 | 0.9262 |
0.7176 | 10.0 | 6130 | 0.2207 | 0.9348 | 0.9273 |
0.7662 | 11.0 | 6743 | 0.2334 | 0.9317 | 0.9261 |
0.6094 | 12.0 | 7356 | 0.2609 | 0.9248 | 0.9205 |
1.2271 | 13.0 | 7969 | 0.2310 | 0.9363 | 0.9267 |
0.4984 | 14.0 | 8582 | 0.2321 | 0.9325 | 0.9273 |
0.8758 | 15.0 | 9195 | 0.2618 | 0.9363 | 0.9250 |
0.9482 | 16.0 | 9808 | 0.2416 | 0.9325 | 0.9244 |
0.7253 | 17.0 | 10421 | 0.2317 | 0.9302 | 0.9265 |
0.6299 | 18.0 | 11034 | 0.2376 | 0.9332 | 0.9256 |
0.686 | 19.0 | 11647 | 0.2213 | 0.9371 | 0.9264 |
0.799 | 20.0 | 12260 | 0.2387 | 0.9248 | 0.9224 |
0.7195 | 21.0 | 12873 | 0.2219 | 0.9355 | 0.9285 |
0.5301 | 22.0 | 13486 | 0.2250 | 0.9355 | 0.9270 |
0.6929 | 23.0 | 14099 | 0.2334 | 0.9340 | 0.9276 |
0.5645 | 24.0 | 14712 | 0.2219 | 0.9332 | 0.9271 |
0.6048 | 25.0 | 15325 | 0.2255 | 0.9378 | 0.9277 |
0.4626 | 26.0 | 15938 | 0.2201 | 0.9371 | 0.9295 |
0.7271 | 27.0 | 16551 | 0.2294 | 0.9348 | 0.9266 |
0.462 | 28.0 | 17164 | 0.2237 | 0.9332 | 0.9271 |
0.5708 | 29.0 | 17777 | 0.2253 | 0.9355 | 0.9285 |
0.3946 | 30.0 | 18390 | 0.2278 | 0.9332 | 0.9278 |
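
The per-epoch Accuracy and F1 columns above were presumably produced by a `compute_metrics` hook passed to the `Trainer`. A minimal sketch is below; using the `evaluate` library and weighted F1 averaging are assumptions, as neither is stated in the card.

```python
# Sketch of a compute_metrics function that would yield Accuracy and F1.
# The "weighted" F1 averaging mode is an assumption.
import evaluate
import numpy as np

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=predictions, references=labels)["accuracy"],
        "f1": f1.compute(predictions=predictions, references=labels, average="weighted")["f1"],
    }
```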

### Framework versions
- Transformers 4.47.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0