wcvz committed
Commit 0103ddb
1 Parent(s): 63d33d0

Model save

README.md ADDED
@@ -0,0 +1,91 @@
---
license: mit
library_name: peft
tags:
- generated_from_trainer
base_model: facebook/esm2_t12_35M_UR50D
metrics:
- accuracy
model-index:
- name: esm2_t12_35M-lora-binding-sites_2024-04-25_14-35-31
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# esm2_t12_35M-lora-binding-sites_2024-04-25_14-35-31

This model is a fine-tuned version of [facebook/esm2_t12_35M_UR50D](https://huggingface.co/facebook/esm2_t12_35M_UR50D) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3589
- Accuracy: 0.8457

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 0.0005701568055793089
- train_batch_size: 64
- eval_batch_size: 64
- seed: 8893
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30

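As a rough illustration, these settings might map onto Transformers' `TrainingArguments` as below. The actual training script is not part of this commit, so every argument not reported above (warmup, weight decay, output path, and so on) is an assumption left at its default.

```python
from transformers import TrainingArguments

# Illustrative sketch only: reconstructs the reported hyperparameters with
# the Transformers 4.39 TrainingArguments API. Unreported settings are
# unknown and left at their defaults.
training_args = TrainingArguments(
    output_dir="esm2_t12_35M-lora-binding-sites_2024-04-25_14-35-31",
    learning_rate=0.0005701568055793089,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=8893,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
    evaluation_strategy="epoch",  # assumption: the results table shows one eval per epoch
)
```
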
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6703 | 1.0 | 24 | 0.6807 | 0.5820 |
| 0.6449 | 2.0 | 48 | 0.6703 | 0.5820 |
| 0.6659 | 3.0 | 72 | 0.6458 | 0.5977 |
| 0.6432 | 4.0 | 96 | 0.6612 | 0.6328 |
| 0.6322 | 5.0 | 120 | 0.6051 | 0.6523 |
| 0.6176 | 6.0 | 144 | 0.6062 | 0.6504 |
| 0.4904 | 7.0 | 168 | 0.5762 | 0.6777 |
| 0.4426 | 8.0 | 192 | 0.5784 | 0.6953 |
| 0.6014 | 9.0 | 216 | 0.5497 | 0.7148 |
| 0.4484 | 10.0 | 240 | 0.5399 | 0.7227 |
| 0.552 | 11.0 | 264 | 0.5142 | 0.7480 |
| 0.3581 | 12.0 | 288 | 0.4395 | 0.7930 |
| 0.3604 | 13.0 | 312 | 0.4201 | 0.8066 |
| 0.2733 | 14.0 | 336 | 0.4107 | 0.8262 |
| 0.2539 | 15.0 | 360 | 0.4373 | 0.8008 |
| 0.3538 | 16.0 | 384 | 0.3954 | 0.8301 |
| 0.4363 | 17.0 | 408 | 0.3852 | 0.8320 |
| 0.3433 | 18.0 | 432 | 0.3735 | 0.8418 |
| 0.2758 | 19.0 | 456 | 0.3685 | 0.8438 |
| 0.2073 | 20.0 | 480 | 0.3860 | 0.8262 |
| 0.3578 | 21.0 | 504 | 0.3689 | 0.8301 |
| 0.3114 | 22.0 | 528 | 0.3626 | 0.8418 |
| 0.3296 | 23.0 | 552 | 0.3621 | 0.8438 |
| 0.276 | 24.0 | 576 | 0.3602 | 0.8457 |
| 0.2583 | 25.0 | 600 | 0.3622 | 0.8457 |
| 0.1917 | 26.0 | 624 | 0.3597 | 0.8477 |
| 0.3588 | 27.0 | 648 | 0.3603 | 0.8477 |
| 0.219 | 28.0 | 672 | 0.3606 | 0.8438 |
| 0.3091 | 29.0 | 696 | 0.3586 | 0.8457 |
| 0.2235 | 30.0 | 720 | 0.3589 | 0.8457 |

### Framework versions

- PEFT 0.10.0
- Transformers 4.39.3
- Pytorch 2.2.1
- Datasets 2.16.1
- Tokenizers 0.15.2
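
Given PEFT 0.10.0 and the base checkpoint above, loading the adapter might look like the sketch below. The repo id `wcvz/esm2_t12_35M-lora-binding-sites_2024-04-25_14-35-31` is inferred from the committer and model names and should be verified, and the token-classification head with `num_labels=2` is an assumption based on the binding-sites naming.

```python
from peft import PeftModel
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Assumption: per-residue (token-level) binary binding-site classification.
# Swap in AutoModelForSequenceClassification if the head is sequence-level.
base = AutoModelForTokenClassification.from_pretrained(
    "facebook/esm2_t12_35M_UR50D", num_labels=2
)
# Hypothetical repo id inferred from this commit; verify before use.
model = PeftModel.from_pretrained(
    base, "wcvz/esm2_t12_35M-lora-binding-sites_2024-04-25_14-35-31"
)
tokenizer = AutoTokenizer.from_pretrained("facebook/esm2_t12_35M_UR50D")

inputs = tokenizer("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", return_tensors="pt")
logits = model(**inputs).logits  # shape: (1, seq_len, num_labels)
```
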
adapter_model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:23cb62358d3a63671d89455617734e05653c8aa53310dd742639246d43ecc8e2
+ oid sha256:587fe8f65b8744f36593cbddf23613508ddc65bbe4c79a48a051dcc3ccea0c8a
  size 1214240
runs/Apr25_14-35-31_c0005/events.out.tfevents.1714070132.c0005.706944.17 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:50c54b2fc851ddaab517f17829924fa8b01a7c9da76eee1b747eb86adaf9014b
- size 160594
+ oid sha256:7079f416a46cdb0c7433d50b59255388f4518a40c18e7387f66b350c1577a071
+ size 166335