---
library_name: transformers
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: bert-base-uncased-nsp-5000-1e-06-8
  results: []
---

# bert-base-uncased-nsp-5000-1e-06-8

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4119

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (restated as a `TrainingArguments` sketch further below):
- learning_rate: 1e-06
- train_batch_size: 32
- eval_batch_size: 1024
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 157  | 0.6923          |
| 0.6979        | 2.0   | 314  | 0.6900          |
| 0.6928        | 3.0   | 471  | 0.6867          |
| 0.6891        | 4.0   | 628  | 0.6805          |
| 0.6891        | 5.0   | 785  | 0.6697          |
| 0.6805        | 6.0   | 942  | 0.6527          |
| 0.6637        | 7.0   | 1099 | 0.6301          |
| 0.6388        | 8.0   | 1256 | 0.6087          |
| 0.6103        | 9.0   | 1413 | 0.5909          |
| 0.6103        | 10.0  | 1570 | 0.5760          |
| 0.5846        | 11.0  | 1727 | 0.5624          |
| 0.5665        | 12.0  | 1884 | 0.5438          |
| 0.5322        | 13.0  | 2041 | 0.5249          |
| 0.5322        | 14.0  | 2198 | 0.5029          |
| 0.5096        | 15.0  | 2355 | 0.4886          |
| 0.4759        | 16.0  | 2512 | 0.4672          |
| 0.4457        | 17.0  | 2669 | 0.4581          |
| 0.4306        | 18.0  | 2826 | 0.4493          |
| 0.4306        | 19.0  | 2983 | 0.4413          |
| 0.4073        | 20.0  | 3140 | 0.4331          |
| 0.3974        | 21.0  | 3297 | 0.4267          |
| 0.3789        | 22.0  | 3454 | 0.4252          |
| 0.3615        | 23.0  | 3611 | 0.4200          |
| 0.3615        | 24.0  | 3768 | 0.4183          |
| 0.356         | 25.0  | 3925 | 0.4164          |
| 0.3588        | 26.0  | 4082 | 0.4152          |
| 0.3465        | 27.0  | 4239 | 0.4146          |
| 0.3465        | 28.0  | 4396 | 0.4133          |
| 0.3431        | 29.0  | 4553 | 0.4123          |
| 0.3451        | 30.0  | 4710 | 0.4119          |

### Framework versions

- Transformers 4.44.2
- PyTorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
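
### Reproducing the training configuration

The hyperparameters above map onto the 🤗 `Trainer` API roughly as follows. This is a reconstruction from the listed values, not the original training script (the dataset and preprocessing are unknown), so the `output_dir` is a placeholder. At batch size 32, the 157 steps per epoch suggest roughly 5,000 training examples, matching the "5000" in the model name; `logging_steps=200` is likewise an inference, since it would reproduce both the "No log" entry at epoch 1 and the duplicated training-loss entries (e.g. epochs 4–5 and 9–10) in the table.

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameters listed above; the original training
# script is not published, so treat this as an approximation.
training_args = TrainingArguments(
    output_dir="bert-base-uncased-nsp-5000-1e-06-8",  # placeholder
    learning_rate=1e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=1024,
    num_train_epochs=30,
    lr_scheduler_type="linear",
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",  # validation loss was reported once per epoch
    logging_steps=200,      # inferred from the "No log"/duplicate-loss pattern
)
```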
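
## How to use

The model name suggests a next sentence prediction (NSP) objective, and the initial validation loss near ln 2 ≈ 0.693 is consistent with a binary classification head starting from chance. Below is a minimal inference sketch under that assumption, using the standard `BertForNextSentencePrediction` head; the repo id and example sentences are placeholders:

```python
import torch
from transformers import AutoTokenizer, BertForNextSentencePrediction

# Placeholder repo id; point this at wherever the checkpoint is actually stored.
model_id = "bert-base-uncased-nsp-5000-1e-06-8"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = BertForNextSentencePrediction.from_pretrained(model_id)
model.eval()

sentence_a = "The cat sat on the mat."
sentence_b = "It purred contentedly in the sun."

# The tokenizer adds [CLS]/[SEP] and sets token_type_ids for the pair.
inputs = tokenizer(sentence_a, sentence_b, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# In BERT's NSP head, index 0 = "B follows A", index 1 = "B is random".
probs = torch.softmax(logits, dim=-1)[0]
print(f"P(is next): {probs[0]:.3f}  P(not next): {probs[1]:.3f}")
```

If the checkpoint was instead saved with a different head (for example `BertForSequenceClassification`), load it with that class; note that the label convention may then be reversed.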