AmelieSchreiber committed
Commit f3dc552
Parent(s): ce9ca82
Update README.md
README.md CHANGED
@@ -18,6 +18,17 @@ tags:
 This model was trained with Hugging Face's Parameter Efficient Fine-Tuning (PEFT) library, in particular,
 a Low Rank Adaptation (LoRA) was trained on top of the model
 [AmelieSchreiber/esm2_t6_8M_finetuned_cafa5](https://huggingface.co/AmelieSchreiber/esm2_t6_8M_finetuned_cafa5).
+For the details and the code for finetuning with Low Rank Adaptation please see
+[this training notebook](https://huggingface.co/AmelieSchreiber/esm2_t6_8M_lora_cafa5/blob/main/cafa_5_finetune_v2.ipynb).
+You may need to run a few `!pip install` statements to run the notebook. In particular, you will need
+
+```python
+!pip install transformers -q
+!pip install accelerate -q
+!pip install peft -q
+```
+
+This model has the following training metrics:
 
 ```
 Epoch 3/3
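The added lines document a LoRA adapter trained with PEFT on top of the base CAFA-5 model. As a minimal, hypothetical sketch (not taken from the README or the notebook), loading that base model together with this repository's adapter for inference could look like the following; the sequence-classification head and the toy protein sequence are assumptions, and the linked training notebook is the authoritative reference.

```python
# Hypothetical usage sketch, not taken from the commit: load the base ESM-2 model
# named in the README and apply this repo's LoRA adapter with Hugging Face PEFT.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from peft import PeftModel

base_id = "AmelieSchreiber/esm2_t6_8M_finetuned_cafa5"  # base model from the README
lora_id = "AmelieSchreiber/esm2_t6_8M_lora_cafa5"       # this repo; assumed to hold the adapter weights

tokenizer = AutoTokenizer.from_pretrained(base_id)
# Assumption: the base checkpoint carries a sequence-classification head.
base_model = AutoModelForSequenceClassification.from_pretrained(base_id)

# Wrap the base model with the LoRA adapter weights.
model = PeftModel.from_pretrained(base_model, lora_id)
model.eval()

# Score a toy protein sequence (placeholder input).
inputs = tokenizer("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.shape)
```

If the notebook trains a different head (for example, multi-label classification over GO terms), the corresponding `AutoModelFor...` class would be swapped in before wrapping the model with `PeftModel`.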