---
license: mit
library_name: peft
tags:
  - generated_from_trainer
base_model: facebook/esm2_t30_150M_UR50D
metrics:
  - accuracy
model-index:
  - name: esm2_t130_150M-lora-classifier_2024-04-25_21-48-08
    results: []
---

# esm2_t130_150M-lora-classifier_2024-04-25_21-48-08

This model is a fine-tuned version of [facebook/esm2_t30_150M_UR50D](https://huggingface.co/facebook/esm2_t30_150M_UR50D) on an unspecified dataset.
It achieves the following results on the evaluation set:

- Loss: 0.5189
- Accuracy: 0.8809
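
A minimal inference sketch is shown below. Since this repository ships only the LoRA adapter weights, the adapter must be loaded on top of the base model. The repository id `wcvz/esm2_t130_150M-lora-classifier_2024-04-25_21-48-08` and the binary classification head are assumptions; neither is stated in this card.

```python
# Sketch: load the LoRA adapter on top of the base ESM-2 model for inference.
# The adapter repo id below is an assumption; adjust it to the actual path.
import torch
from transformers import AutoTokenizer
from peft import AutoPeftModelForSequenceClassification

adapter_id = "wcvz/esm2_t130_150M-lora-classifier_2024-04-25_21-48-08"

# The tokenizer comes from the base checkpoint named in this card.
tokenizer = AutoTokenizer.from_pretrained("facebook/esm2_t30_150M_UR50D")

# AutoPeftModelForSequenceClassification reads the base model id from the
# adapter config, loads it, and applies the LoRA weights on top.
model = AutoPeftModelForSequenceClassification.from_pretrained(adapter_id)
model.eval()

# Classify one (toy) protein sequence.
inputs = tokenizer("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)
```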

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0005701568055793089
- train_batch_size: 12
- eval_batch_size: 12
- seed: 8893
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
- mixed_precision_training: Native AMP
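
As a point of reference, these settings map onto `transformers.TrainingArguments` roughly as sketched below. This is not the authors' training script: `output_dir` and the per-epoch evaluation/logging strategies are assumptions, and the Adam betas/epsilon are the optimizer defaults the card lists.

```python
# Sketch: TrainingArguments mirroring the hyperparameters listed above
# (transformers 4.39 argument names; output_dir is a placeholder).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="esm2-lora-classifier",   # assumption: not stated in the card
    learning_rate=0.0005701568055793089,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=12,
    seed=8893,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    fp16=True,                           # "Native AMP" mixed precision
    adam_beta1=0.9,                      # optimizer defaults, as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",         # assumption: metrics reported per epoch
    logging_strategy="epoch",
)
```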

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6192 | 1.0 | 128 | 0.6737 | 0.6055 |
| 0.4321 | 2.0 | 256 | 0.6507 | 0.6289 |
| 0.571 | 3.0 | 384 | 0.5572 | 0.7188 |
| 0.3053 | 4.0 | 512 | 0.5090 | 0.7852 |
| 0.5055 | 5.0 | 640 | 0.3370 | 0.8516 |
| 0.2786 | 6.0 | 768 | 0.3710 | 0.8594 |
| 0.1327 | 7.0 | 896 | 0.3055 | 0.8711 |
| 0.2127 | 8.0 | 1024 | 0.2891 | 0.8945 |
| 0.0913 | 9.0 | 1152 | 0.3454 | 0.8691 |
| 0.0134 | 10.0 | 1280 | 0.3354 | 0.8809 |
| 0.2597 | 11.0 | 1408 | 0.3436 | 0.8848 |
| 0.0276 | 12.0 | 1536 | 0.4181 | 0.8633 |
| 0.0929 | 13.0 | 1664 | 0.3722 | 0.8789 |
| 0.9377 | 14.0 | 1792 | 0.5086 | 0.8730 |
| 0.2894 | 15.0 | 1920 | 0.3311 | 0.8906 |
| 0.3138 | 16.0 | 2048 | 0.4739 | 0.8809 |
| 0.0088 | 17.0 | 2176 | 0.3875 | 0.8867 |
| 0.3591 | 18.0 | 2304 | 0.4032 | 0.8809 |
| 0.0436 | 19.0 | 2432 | 0.4316 | 0.8887 |
| 0.0037 | 20.0 | 2560 | 0.4931 | 0.8789 |
| 0.0322 | 21.0 | 2688 | 0.4787 | 0.8809 |
| 0.0035 | 22.0 | 2816 | 0.4460 | 0.8770 |
| 0.0859 | 23.0 | 2944 | 0.4914 | 0.8828 |
| 0.039 | 24.0 | 3072 | 0.4955 | 0.8770 |
| 0.4208 | 25.0 | 3200 | 0.5211 | 0.8828 |
| 0.1874 | 26.0 | 3328 | 0.5376 | 0.8711 |
| 0.4433 | 27.0 | 3456 | 0.5319 | 0.8750 |
| 0.2976 | 28.0 | 3584 | 0.5201 | 0.8809 |
| 0.0223 | 29.0 | 3712 | 0.5179 | 0.8809 |
| 0.0021 | 30.0 | 3840 | 0.5189 | 0.8809 |

### Framework versions

- PEFT 0.10.0
- Transformers 4.39.3
- PyTorch 2.2.1
- Datasets 2.16.1
- Tokenizers 0.15.2