**Update @ 2024.03.07**

# Open-platypus-SOLAR-10.7B-v1.0

This model is a fine-tuned version of [upstage/SOLAR-10.7B-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-v1.0).

## Training hyperparameters

The following hyperparameters were used during training:
- batch_size = 16
- num_epochs = 1
- micro_batch = 1
- cutoff_len = 4096
- learning_rate = 4e-4

## Framework versions

- Transformers 4.34.1
- Pytorch 2.1.0+cu121
- Datasets 2.13.0
- Tokenizers 0.14.1
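
## Example training setup

For reference, the sketch below shows one way the hyperparameters above could map onto a Hugging Face `Trainer` run. This is an illustration, not the script actually used for this model: the dataset path, prompt format, and dtype are assumptions, and only the hyperparameter values come from this card. Note that an effective `batch_size` of 16 with `micro_batch = 1` implies 16 gradient-accumulation steps.

```python
# Hedged sketch: maps this card's hyperparameters onto a Trainer setup.
# Dataset path, prompt format, and dtype below are assumptions.
import torch
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

BASE_MODEL = "upstage/SOLAR-10.7B-v1.0"
CUTOFF_LEN = 4096                        # cutoff_len: max tokenized length
MICRO_BATCH = 1                          # micro_batch: per-device batch size
BATCH_SIZE = 16                          # batch_size: effective batch size
GRAD_ACCUM = BATCH_SIZE // MICRO_BATCH   # 16 accumulation steps

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL, torch_dtype=torch.bfloat16)

# Assumed dataset: the model name suggests Open-Platypus, but the card
# does not state the exact dataset path used.
dataset = load_dataset("garage-bAInd/Open-Platypus", split="train")

def tokenize(example):
    # Assumed prompt format: instruction followed by its response.
    text = example["instruction"] + "\n" + example["output"]
    return tokenizer(text, truncation=True, max_length=CUTOFF_LEN)

tokenized = dataset.map(tokenize, remove_columns=dataset.column_names)

args = TrainingArguments(
    output_dir="open-platypus-solar-10.7b",
    per_device_train_batch_size=MICRO_BATCH,
    gradient_accumulation_steps=GRAD_ACCUM,
    num_train_epochs=1,                  # num_epochs
    learning_rate=4e-4,                  # learning_rate
    bf16=True,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```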