Transaminitis_M2_1000rate_1e6_SFT
This model is a fine-tuned version of mistralai/Mistral-7B-Instruct-v0.2 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.3276
Model description
More information needed
Intended uses & limitations
More information needed
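No usage example is provided in the card. The sketch below shows one way to load this checkpoint for inference with the transformers library; the Hub repository id (taken from the model's listing), the dtype/device settings, and the example prompt are assumptions rather than details from the original card.

```python
# Minimal inference sketch (not from the original card).
# Assumes the checkpoint is published as "tsavage68/Transaminitis_M2_1000rate_1e6_SFT"
# and that it keeps Mistral-7B-Instruct-v0.2's chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tsavage68/Transaminitis_M2_1000rate_1e6_SFT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# Hypothetical prompt; the card does not describe the task or data.
messages = [{"role": "user", "content": "Summarize the patient's liver panel."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```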
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 1e-06
- train_batch_size: 2
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 100
- training_steps: 1000
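The original training script is not included in the card. As a rough reconstruction, the listed hyperparameters map onto transformers `TrainingArguments` roughly as shown below; the output directory, evaluation/logging cadence, and optimizer name are assumptions (the Adam betas and epsilon above are the library defaults).

```python
# Approximate reconstruction of the listed hyperparameters; not the author's script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Transaminitis_M2_1000rate_1e6_SFT",  # assumed output directory
    learning_rate=1e-6,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=2,   # effective train batch size of 4
    seed=42,
    optim="adamw_torch",             # Adam with betas=(0.9, 0.999), eps=1e-8 (defaults)
    lr_scheduler_type="cosine",
    warmup_steps=100,
    max_steps=1000,
    evaluation_strategy="steps",     # assumption: eval every 25 steps, matching the table
    eval_steps=25,
    logging_steps=25,
)
```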
Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 1.6745 | 0.2 | 25 | 0.5115 |
| 0.3639 | 0.4 | 50 | 0.3423 |
| 0.2961 | 0.6 | 75 | 0.3021 |
| 0.2895 | 0.8 | 100 | 0.2865 |
| 0.2911 | 1.0 | 125 | 0.2715 |
| 0.2869 | 1.2 | 150 | 0.2590 |
| 0.2352 | 1.4 | 175 | 0.2345 |
| 0.2329 | 1.6 | 200 | 0.2288 |
| 0.2228 | 1.8 | 225 | 0.2251 |
| 0.2177 | 2.0 | 250 | 0.2227 |
| 0.2179 | 2.2 | 275 | 0.2200 |
| 0.2137 | 2.4 | 300 | 0.2208 |
| 0.2144 | 2.6 | 325 | 0.2271 |
| 0.2125 | 2.8 | 350 | 0.2190 |
| 0.2095 | 3.0 | 375 | 0.2174 |
| 0.2019 | 3.2 | 400 | 0.2230 |
| 0.1978 | 3.4 | 425 | 0.2198 |
| 0.2078 | 3.6 | 450 | 0.2174 |
| 0.2045 | 3.8 | 475 | 0.2226 |
| 0.2014 | 4.0 | 500 | 0.2205 |
| 0.1785 | 4.2 | 525 | 0.2476 |
| 0.1849 | 4.4 | 550 | 0.2391 |
| 0.1808 | 4.6 | 575 | 0.2435 |
| 0.1845 | 4.8 | 600 | 0.2456 |
| 0.1766 | 5.0 | 625 | 0.2450 |
| 0.1379 | 5.2 | 650 | 0.2867 |
| 0.1406 | 5.4 | 675 | 0.2916 |
| 0.1345 | 5.6 | 700 | 0.2916 |
| 0.1395 | 5.8 | 725 | 0.2895 |
| 0.141 | 6.0 | 750 | 0.2903 |
| 0.1141 | 6.2 | 775 | 0.3150 |
| 0.1125 | 6.4 | 800 | 0.3215 |
| 0.1094 | 6.6 | 825 | 0.3244 |
| 0.1123 | 6.8 | 850 | 0.3259 |
| 0.1148 | 7.0 | 875 | 0.3260 |
| 0.1062 | 7.2 | 900 | 0.3268 |
| 0.1041 | 7.4 | 925 | 0.3270 |
| 0.1031 | 7.6 | 950 | 0.3271 |
| 0.1044 | 7.8 | 975 | 0.3277 |
| 0.1012 | 8.0 | 1000 | 0.3276 |
Framework versions
- Transformers 4.40.2
- Pytorch 2.0.0+cu117
- Datasets 2.19.1
- Tokenizers 0.19.1