---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: distilBERT_with_preprocessing_grid_search
  results: []
---

# distilBERT_with_preprocessing_grid_search

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co./distilbert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8637
- Precision: 0.8392
- Recall: 0.8339
- F1: 0.8360
- Accuracy: 0.8630

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch reproducing these settings appears at the end of this card):
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.9492        | 1.0   | 510  | 0.5973          | 0.7572    | 0.8287 | 0.7836 | 0.8434   |
| 0.4661        | 2.0   | 1020 | 0.5080          | 0.8146    | 0.8535 | 0.8311 | 0.8567   |
| 0.2954        | 3.0   | 1530 | 0.6910          | 0.8283    | 0.8231 | 0.8245 | 0.8591   |
| 0.2263        | 4.0   | 2040 | 0.7367          | 0.8448    | 0.8293 | 0.8363 | 0.8635   |
| 0.1749        | 5.0   | 2550 | 0.7399          | 0.8402    | 0.8373 | 0.8383 | 0.8650   |
| 0.1273        | 6.0   | 3060 | 0.7759          | 0.8352    | 0.8414 | 0.8377 | 0.8689   |
| 0.1051        | 7.0   | 3570 | 0.8864          | 0.8375    | 0.8271 | 0.8308 | 0.8616   |
| 0.0877        | 8.0   | 4080 | 0.8407          | 0.8327    | 0.8360 | 0.8335 | 0.8625   |
| 0.0781        | 9.0   | 4590 | 0.8586          | 0.8345    | 0.8362 | 0.8345 | 0.8645   |
| 0.0627        | 10.0  | 5100 | 0.8637          | 0.8392    | 0.8339 | 0.8360 | 0.8630   |

### Framework versions

- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
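
## Example usage

The card does not state the downstream task, but the reported precision/recall/F1/accuracy combination is the metric set typically logged by token-classification fine-tuning. The snippet below is a minimal sketch under that assumption; the model id and example sentence are placeholders, not confirmed by this card.

```python
from transformers import pipeline

# Assumption: the checkpoint carries a token-classification head on distilbert-base-uncased.
# Replace the placeholder model id with the actual repository path of this checkpoint.
classifier = pipeline(
    "token-classification",
    model="distilBERT_with_preprocessing_grid_search",
    aggregation_strategy="simple",  # merge sub-word pieces into whole-word predictions
)

print(classifier("Example sentence to tag."))
```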
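
## Training configuration sketch

The hyperparameters listed above map directly onto `transformers.TrainingArguments`. The sketch below reproduces only those documented settings; the dataset, preprocessing, model head, and metric computation are not documented in this card, and the per-epoch evaluation cadence is inferred from the results table.

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments matching the hyperparameters listed in this card.
# Only the optimizer/schedule settings are known; everything else is omitted.
training_args = TrainingArguments(
    output_dir="distilBERT_with_preprocessing_grid_search",
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumption: the table reports metrics once per epoch
)
```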