---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: LovenOO/distilBERT_without_preprocessing_grid_search
  results: []
---

# LovenOO/distilBERT_without_preprocessing_grid_search

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0863
- Validation Loss: 0.5369
- Train Precision: 0.7209
- Train Recall: 0.6880
- Train F1: 0.6970
- Train Accuracy: 0.8654
- Epoch: 8

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 5140, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32

A sketch reconstructing this optimizer configuration in Keras appears at the end of this card.

### Training results

| Train Loss | Validation Loss | Train Precision | Train Recall | Train F1 | Train Accuracy | Epoch |
|:----------:|:---------------:|:---------------:|:------------:|:--------:|:--------------:|:-----:|
| 1.1440     | 0.6108          | 0.5739          | 0.5788       | 0.5727   | 0.8367         | 0     |
| 0.5050     | 0.5447          | 0.6506          | 0.6243       | 0.6325   | 0.8489         | 1     |
| 0.3584     | 0.4823          | 0.6518          | 0.6619       | 0.6558   | 0.8620         | 2     |
| 0.2612     | 0.4890          | 0.7183          | 0.6777       | 0.6902   | 0.8654         | 3     |
| 0.1901     | 0.4922          | 0.7137          | 0.6880       | 0.6937   | 0.8639         | 4     |
| 0.1566     | 0.5050          | 0.7220          | 0.6838       | 0.6953   | 0.8703         | 5     |
| 0.1189     | 0.5284          | 0.7088          | 0.6911       | 0.6920   | 0.8712         | 6     |
| 0.1059     | 0.5285          | 0.7113          | 0.6835       | 0.6900   | 0.8635         | 7     |
| 0.0863     | 0.5369          | 0.7209          | 0.6880       | 0.6970   | 0.8654         | 8     |

### Framework versions

- Transformers 4.24.0
- TensorFlow 2.13.0
- Datasets 2.14.2
- Tokenizers 0.11.0
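
The serialized optimizer config logged above corresponds to an Adam optimizer driven by a polynomial decay schedule; with `power=1.0` the decay is linear from 3e-05 to 0 over 5140 steps. A minimal Keras sketch reconstructing it from those logged values (not an export of the actual training script):

```python
import tensorflow as tf

# Linear decay (power=1.0) from 3e-05 to 0.0 over 5140 steps,
# matching the PolynomialDecay config logged in the hyperparameters.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=3e-05,
    decay_steps=5140,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

# Adam settings copied from the logged optimizer config.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
```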
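
The card does not state the downstream task, but the reported precision/recall/F1/accuracy metrics suggest text classification. A minimal inference sketch, assuming the Hub repository contains TensorFlow weights for a sequence-classification head (the label names in `id2label` may be generic placeholders such as `LABEL_0`):

```python
import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModelForSequenceClassification

repo = "LovenOO/distilBERT_without_preprocessing_grid_search"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = TFAutoModelForSequenceClassification.from_pretrained(repo)

# Tokenize a sample input and take the highest-scoring class.
inputs = tokenizer("Example text to classify", return_tensors="tf")
logits = model(**inputs).logits
predicted_class_id = int(tf.math.argmax(logits, axis=-1)[0])
print(model.config.id2label[predicted_class_id])
```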