ThatsGroes committed • Commit 969acd8
1 Parent(s): cf9a387
Update README.md

README.md CHANGED
@@ -9,3 +9,7 @@ language:
 
 # Munin-7b-alpha instruction fine-tuned
 Munin-7b-alpha from [Danish Foundation Models](https://www.foundationmodels.dk/), fine-tuned by [yours truly](https://www.linkedin.com/in/kaspergroesludvigsen/) for 1 epoch on [kobprof/skolegpt-instruct](https://huggingface.co/datasets/kobprof/skolegpt-instruct) using the code from [this notebook](https://github.com/alexandrainst/d3a-llm-workshop) by The Alexandra Institute.
+
+Trained on a single Nvidia RTX A4000 GPU using 13.82 GB of GPU memory (87.84%), of which 8.71 GB (55.39%) was used for LoRA.
+
+The model trained for just shy of 4 hours, consuming a total of 0.694 kWh (as per estimates produced with CodeCarbon) and emitting approximately 57 g CO2e (the average CO2e emission per kWh during training was 82.5 g, as per https://www.energidataservice.dk/tso-electricity/CO2Emis).
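
For a concrete picture of what this kind of fine-tune involves, here is a minimal sketch using the Hugging Face transformers, peft, and datasets libraries. It is not the workshop notebook's code: the model id, prompt template, dataset column names, and LoRA hyperparameters below are illustrative assumptions.

```python
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base = "danish-foundation-models/munin-7b-alpha"  # assumed HF model id
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base, torch_dtype=torch.bfloat16)

# LoRA: freeze the 7B base weights and train small low-rank adapters instead.
lora = LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora)

ds = load_dataset("kobprof/skolegpt-instruct", split="train")

def tokenize(example):
    # Assumed prompt template and column names; the real dataset may differ.
    text = f"{example['question']}\n\n{example['response']}"
    return tokenizer(text, truncation=True, max_length=1024)

ds = ds.map(tokenize, remove_columns=ds.column_names)

args = TrainingArguments(
    output_dir="munin-7b-alpha-skolegpt-lora",
    num_train_epochs=1,                 # single epoch, as described above
    per_device_train_batch_size=2,
    gradient_accumulation_steps=8,
    learning_rate=2e-4,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=ds,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The design point behind the memory numbers above is that LoRA keeps the base model's weights frozen and only trains low-rank adapter matrices, which is what makes fine-tuning a 7B model feasible on a single 16 GB card like the RTX A4000.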
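
As a sanity check on those figures: 0.694 kWh × 82.5 g CO2e/kWh ≈ 57.3 g CO2e, matching the roughly 57 g stated. A minimal sketch of how a CodeCarbon tracker wraps a training run follows; the `trainer` object is carried over from the hypothetical sketch above.

```python
# Minimal CodeCarbon sketch; `trainer` is the hypothetical Trainer from above.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker()   # samples CPU/GPU/RAM power draw while running
tracker.start()
try:
    trainer.train()            # the fine-tuning run being measured
finally:
    emissions_kg = tracker.stop()  # returns estimated emissions in kg CO2e

# Cross-check against the figures above:
# 0.694 kWh * 82.5 g CO2e/kWh = 57.255 g CO2e, i.e. about 57 g CO2e
print(f"Estimated emissions: {emissions_kg:.4f} kg CO2e")
```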