---
license: mit
library_name: peft
tags:
  - generated_from_trainer
base_model: facebook/esm2_t30_150M_UR50D
metrics:
  - accuracy
model-index:
  - name: esm2_t130_150M-lora-classifier_2024-04-26_10-08-51
    results: []
---

# esm2_t130_150M-lora-classifier_2024-04-26_10-08-51

This model is a fine-tuned version of [facebook/esm2_t30_150M_UR50D](https://huggingface.co/facebook/esm2_t30_150M_UR50D) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.4537
- Accuracy: 0.8984

## Model description

More information needed
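The card leaves this section empty, but the name and metadata identify the model as a LoRA adapter trained with peft on facebook/esm2_t30_150M_UR50D for sequence classification. A minimal sketch of such a setup follows; the LoRA rank, alpha, dropout, target modules, and label count below are illustrative assumptions, not values recorded in this card.

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForSequenceClassification

# Base protein language model with a sequence-classification head.
base = AutoModelForSequenceClassification.from_pretrained(
    "facebook/esm2_t30_150M_UR50D",
    num_labels=2,  # assumption: binary classification; not stated in the card
)

# LoRA adapter configuration; r/alpha/dropout/target_modules are assumed,
# typical choices for ESM-2 attention projections, not taken from this card.
lora_config = LoraConfig(
    task_type="SEQ_CLS",
    r=8,
    lora_alpha=16,
    lora_dropout=0.1,
    target_modules=["query", "value"],
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```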

## Intended uses & limitations

More information needed
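Pending a proper usage statement, the sketch below shows one plausible way to load the adapter for inference. The adapter repo id is inferred from this card's title, `num_labels=2` is an assumption, and the input sequence is an arbitrary placeholder.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForSequenceClassification, AutoTokenizer

base_id = "facebook/esm2_t30_150M_UR50D"
adapter_id = "wcvz/esm2_t130_150M-lora-classifier_2024-04-26_10-08-51"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForSequenceClassification.from_pretrained(base_id, num_labels=2)  # num_labels assumed
model = PeftModel.from_pretrained(base, adapter_id)  # attach the LoRA adapter
model.eval()

# Arbitrary example sequence; replace with a protein of interest.
inputs = tokenizer("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)
```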

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0008701568055793088
- train_batch_size: 28
- eval_batch_size: 28
- seed: 8893
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
- mixed_precision_training: Native AMP
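
As a rough guide, these values map onto `transformers.TrainingArguments` as sketched below; the output directory and the epoch-level evaluation/logging strategies are assumptions (the results table logs one evaluation per epoch), and the actual Trainer setup is not recorded in this card.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="esm2-lora-classifier",  # placeholder path, not from the card
    learning_rate=8.701568055793088e-4,
    per_device_train_batch_size=28,
    per_device_eval_batch_size=28,
    seed=8893,
    lr_scheduler_type="linear",
    num_train_epochs=30,
    fp16=True,                     # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",   # assumed: one evaluation per epoch
    logging_strategy="epoch",
)
```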

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6764        | 1.0   | 55   | 0.6794          | 0.5820   |
| 0.5521        | 2.0   | 110  | 0.6192          | 0.6777   |
| 0.5409        | 3.0   | 165  | 0.5147          | 0.7383   |
| 0.5518        | 4.0   | 220  | 0.3518          | 0.8672   |
| 0.1386        | 5.0   | 275  | 0.3596          | 0.8574   |
| 0.303         | 6.0   | 330  | 0.4030          | 0.8359   |
| 0.1962        | 7.0   | 385  | 0.3143          | 0.8848   |
| 0.1501        | 8.0   | 440  | 0.3232          | 0.8652   |
| 0.2994        | 9.0   | 495  | 0.3014          | 0.8770   |
| 0.0914        | 10.0  | 550  | 0.2980          | 0.8887   |
| 0.2108        | 11.0  | 605  | 0.2854          | 0.8770   |
| 0.2896        | 12.0  | 660  | 0.3684          | 0.8691   |
| 0.0818        | 13.0  | 715  | 0.3349          | 0.8828   |
| 0.3152        | 14.0  | 770  | 0.3530          | 0.8848   |
| 0.0554        | 15.0  | 825  | 0.3371          | 0.8887   |
| 0.1928        | 16.0  | 880  | 0.3347          | 0.8750   |
| 0.2658        | 17.0  | 935  | 0.3765          | 0.8867   |
| 0.4242        | 18.0  | 990  | 0.4166          | 0.8945   |
| 0.0964        | 19.0  | 1045 | 0.3400          | 0.8945   |
| 0.0375        | 20.0  | 1100 | 0.3581          | 0.9004   |
| 0.1781        | 21.0  | 1155 | 0.3816          | 0.8848   |
| 0.1563        | 22.0  | 1210 | 0.3940          | 0.8867   |
| 0.017         | 23.0  | 1265 | 0.4098          | 0.8926   |
| 0.1866        | 24.0  | 1320 | 0.4710          | 0.8770   |
| 0.0632        | 25.0  | 1375 | 0.4541          | 0.8828   |
| 0.1501        | 26.0  | 1430 | 0.4645          | 0.8828   |
| 0.109         | 27.0  | 1485 | 0.4434          | 0.8926   |
| 0.0353        | 28.0  | 1540 | 0.4264          | 0.8984   |
| 0.4502        | 29.0  | 1595 | 0.4479          | 0.8984   |
| 0.0341        | 30.0  | 1650 | 0.4537          | 0.8984   |

### Framework versions

- PEFT 0.10.0
- Transformers 4.39.3
- PyTorch 2.2.1
- Datasets 2.16.1
- Tokenizers 0.15.2