---
base_model: TinyLlama/TinyLlama_v1.1
library_name: peft
license: apache-2.0
tags:
- unsloth
- generated_from_trainer
model-index:
- name: tinyllama_magiccoder_ortho
  results: []
---

# tinyllama_magiccoder_ortho

This model is a fine-tuned version of [TinyLlama/TinyLlama_v1.1](https://huggingface.co./TinyLlama/TinyLlama_v1.1) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4401

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.02
- num_epochs: 1

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.8608        | 0.0262 | 4    | 1.8810          |
| 1.7875        | 0.0523 | 8    | 1.6839          |
| 1.6211        | 0.0785 | 12   | 1.6379          |
| 1.5275        | 0.1047 | 16   | 1.5652          |
| 1.5674        | 0.1308 | 20   | 1.5608          |
| 1.4798        | 0.1570 | 24   | 1.5422          |
| 1.4840        | 0.1832 | 28   | 1.5231          |
| 1.5240        | 0.2093 | 32   | 1.5137          |
| 1.4406        | 0.2355 | 36   | 1.5169          |
| 1.5328        | 0.2617 | 40   | 1.5073          |
| 1.5037        | 0.2878 | 44   | 1.4938          |
| 1.5445        | 0.3140 | 48   | 1.4939          |
| 1.5157        | 0.3401 | 52   | 1.4924          |
| 1.4724        | 0.3663 | 56   | 1.4748          |
| 1.5457        | 0.3925 | 60   | 1.4869          |
| 1.5126        | 0.4186 | 64   | 1.4744          |
| 1.4564        | 0.4448 | 68   | 1.4726          |
| 1.4628        | 0.4710 | 72   | 1.4695          |
| 1.4244        | 0.4971 | 76   | 1.4711          |
| 1.5274        | 0.5233 | 80   | 1.4654          |
| 1.5459        | 0.5495 | 84   | 1.4615          |
| 1.4562        | 0.5756 | 88   | 1.4600          |
| 1.3771        | 0.6018 | 92   | 1.4578          |
| 1.3837        | 0.6280 | 96   | 1.4537          |
| 1.4600        | 0.6541 | 100  | 1.4546          |
| 1.4711        | 0.6803 | 104  | 1.4554          |
| 1.4257        | 0.7065 | 108  | 1.4455          |
| 1.4661        | 0.7326 | 112  | 1.4473          |
| 1.4269        | 0.7588 | 116  | 1.4469          |
| 1.4269        | 0.7850 | 120  | 1.4433          |
| 1.4514        | 0.8111 | 124  | 1.4461          |
| 1.4349        | 0.8373 | 128  | 1.4458          |
| 1.3174        | 0.8635 | 132  | 1.4409          |
| 1.4861        | 0.8896 | 136  | 1.4399          |
| 1.3536        | 0.9158 | 140  | 1.4408          |
| 1.4080        | 0.9419 | 144  | 1.4404          |
| 1.4435        | 0.9681 | 148  | 1.4401          |
| 1.4317        | 0.9943 | 152  | 1.4401          |

### Framework versions

- PEFT 0.12.0
- Transformers 4.44.0
- PyTorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
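
### Reconstructing the training setup

The hyperparameter list above maps directly onto a `transformers.TrainingArguments`. Below is a minimal sketch, assuming the standard `Trainer`-based setup implied by the `generated_from_trainer` tag; `output_dir` is a placeholder, and values not listed in the card are not reconstructed. The listed Adam betas (0.9, 0.999) and epsilon (1e-08) are the `TrainingArguments` defaults, so they need no explicit arguments:

```python
from transformers import TrainingArguments

# Sketch reconstructed from the hyperparameter list in this card.
# Anything not listed there (e.g. output_dir) is an assumption,
# not taken from the actual training script.
training_args = TrainingArguments(
    output_dir="tinyllama_magiccoder_ortho",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,  # 8 * 8 = total train batch size of 64
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    num_train_epochs=1,
)
```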
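
## How to use

The card does not include a usage snippet, so the following is a minimal sketch of loading a PEFT adapter such as this one on top of its base model. The repo id is a placeholder for wherever this adapter is hosted, and the prompt format is an assumption:

```python
import torch
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

adapter_id = "your-username/tinyllama_magiccoder_ortho"  # placeholder repo id

# AutoPeftModelForCausalLM reads the adapter config, fetches the base model
# (TinyLlama/TinyLlama_v1.1) and attaches the adapter weights on top of it.
model = AutoPeftModelForCausalLM.from_pretrained(
    adapter_id,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained("TinyLlama/TinyLlama_v1.1")

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```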