# pythia-70m_tatsu-lab_alpaca_farm_sftsd0_policy_pythia-6.9b_gold_pythia-6.9b_noise0.2_rmsd4
This model is a fine-tuned version of `RylanSchaeffer/EleutherAI_pythia-70m_tatsu-lab_alpaca_farm_sftseed0` on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.7361
- Accuracy: 0.5448
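The checkpoint can be loaded with the standard `transformers` Auto classes. A minimal sketch, assuming the model is hosted on the Hub under the `RylanSchaeffer` namespace with the name above (a hypothetical repo id) and that the `rm` in the name indicates a sequence-classification (reward-model) head; neither is confirmed by this card:

```python
# Minimal loading sketch. The repo id and the sequence-classification head
# are assumptions inferred from the model name, not stated by this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = (  # hypothetical repo id; substitute the actual Hub path
    "RylanSchaeffer/pythia-70m_tatsu-lab_alpaca_farm_sftsd0_policy_"
    "pythia-6.9b_gold_pythia-6.9b_noise0.2_rmsd4"
)
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("An example response to score.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # scalar score(s) for the input text
print(logits)
```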
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch mapping them onto the `transformers` API follows the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 4
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.025
- num_epochs: 5
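For reference, a hedged sketch of how these values map onto `transformers.TrainingArguments`; the output directory is a placeholder, and the model, dataset, and metric wiring are not specified by this card:

```python
# Sketch only: maps the hyperparameters listed above onto TrainingArguments.
# The Trainer setup (model, dataset, metrics) is not specified by this card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="pythia-70m_rm_sd4",    # placeholder name
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=4,
    gradient_accumulation_steps=2,     # 16 * 2 = effective batch size of 32
    lr_scheduler_type="linear",
    warmup_ratio=0.025,
    num_train_epochs=5,
    adam_beta1=0.9,                    # optimizer settings listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```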
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
No log | 0 | 0 | 0.8760 | 0.4952 |
0.8602 | 0.0648 | 100 | 0.8700 | 0.5052 |
0.8477 | 0.1295 | 200 | 0.8512 | 0.5075 |
0.8875 | 0.1943 | 300 | 0.8288 | 0.5106 |
0.8263 | 0.2591 | 400 | 0.8145 | 0.5171 |
0.7877 | 0.3238 | 500 | 0.8051 | 0.5079 |
0.7886 | 0.3886 | 600 | 0.7899 | 0.5344 |
0.7859 | 0.4534 | 700 | 0.7865 | 0.5263 |
0.8393 | 0.5181 | 800 | 0.7794 | 0.5225 |
0.7919 | 0.5829 | 900 | 0.7676 | 0.5336 |
0.741 | 0.6477 | 1000 | 0.7694 | 0.5267 |
0.7088 | 0.7124 | 1100 | 0.7634 | 0.5286 |
0.7542 | 0.7772 | 1200 | 0.7636 | 0.5256 |
0.7837 | 0.8420 | 1300 | 0.7582 | 0.5333 |
0.7461 | 0.9067 | 1400 | 0.7526 | 0.5348 |
0.7674 | 0.9715 | 1500 | 0.7537 | 0.5306 |
0.7818 | 1.0363 | 1600 | 0.7470 | 0.5409 |
0.7251 | 1.1010 | 1700 | 0.7470 | 0.5479 |
0.6958 | 1.1658 | 1800 | 0.7428 | 0.5375 |
0.7559 | 1.2306 | 1900 | 0.7444 | 0.5433 |
0.7135 | 1.2953 | 2000 | 0.7420 | 0.5471 |
0.7507 | 1.3601 | 2100 | 0.7433 | 0.5333 |
0.684 | 1.4249 | 2200 | 0.7445 | 0.5506 |
0.8107 | 1.4896 | 2300 | 0.7400 | 0.5459 |
0.7526 | 1.5544 | 2400 | 0.7438 | 0.5313 |
0.7405 | 1.6192 | 2500 | 0.7414 | 0.5336 |
0.7509 | 1.6839 | 2600 | 0.7365 | 0.5402 |
0.7731 | 1.7487 | 2700 | 0.7407 | 0.5436 |
0.7534 | 1.8135 | 2800 | 0.7369 | 0.5436 |
0.7607 | 1.8782 | 2900 | 0.7404 | 0.5340 |
0.7525 | 1.9430 | 3000 | 0.7366 | 0.5456 |
0.7276 | 2.0078 | 3100 | 0.7377 | 0.5463 |
0.7188 | 2.0725 | 3200 | 0.7393 | 0.5402 |
0.7393 | 2.1373 | 3300 | 0.7413 | 0.5479 |
0.7584 | 2.2021 | 3400 | 0.7395 | 0.5402 |
0.7131 | 2.2668 | 3500 | 0.7405 | 0.5459 |
0.719 | 2.3316 | 3600 | 0.7396 | 0.5402 |
0.7554 | 2.3964 | 3700 | 0.7400 | 0.5386 |
0.7405 | 2.4611 | 3800 | 0.7387 | 0.5440 |
0.7241 | 2.5259 | 3900 | 0.7370 | 0.5506 |
0.7639 | 2.5907 | 4000 | 0.7407 | 0.5367 |
0.7445 | 2.6554 | 4100 | 0.7424 | 0.5409 |
0.7208 | 2.7202 | 4200 | 0.7412 | 0.5394 |
0.6943 | 2.7850 | 4300 | 0.7399 | 0.5421 |
0.7786 | 2.8497 | 4400 | 0.7391 | 0.5444 |
0.6872 | 2.9145 | 4500 | 0.7378 | 0.5413 |
0.7482 | 2.9793 | 4600 | 0.7391 | 0.5436 |
0.7052 | 3.0440 | 4700 | 0.7387 | 0.5452 |
0.7101 | 3.1088 | 4800 | 0.7381 | 0.5448 |
0.7355 | 3.1736 | 4900 | 0.7390 | 0.5429 |
0.7271 | 3.2383 | 5000 | 0.7365 | 0.5402 |
0.7477 | 3.3031 | 5100 | 0.7413 | 0.5398 |
0.7566 | 3.3679 | 5200 | 0.7388 | 0.5398 |
0.7372 | 3.4326 | 5300 | 0.7401 | 0.5363 |
0.7258 | 3.4974 | 5400 | 0.7384 | 0.5467 |
0.7233 | 3.5622 | 5500 | 0.7389 | 0.5471 |
0.7402 | 3.6269 | 5600 | 0.7406 | 0.5483 |
0.7111 | 3.6917 | 5700 | 0.7341 | 0.5456 |
0.7417 | 3.7565 | 5800 | 0.7387 | 0.5352 |
0.7308 | 3.8212 | 5900 | 0.7381 | 0.5421 |
0.7259 | 3.8860 | 6000 | 0.7399 | 0.5521 |
0.7255 | 3.9508 | 6100 | 0.7407 | 0.5421 |
0.7549 | 4.0155 | 6200 | 0.7418 | 0.5383 |
0.7355 | 4.0803 | 6300 | 0.7380 | 0.5383 |
0.7696 | 4.1451 | 6400 | 0.7405 | 0.5440 |
0.7103 | 4.2098 | 6500 | 0.7396 | 0.5313 |
0.7509 | 4.2746 | 6600 | 0.7406 | 0.5483 |
0.7498 | 4.3394 | 6700 | 0.7372 | 0.5459 |
0.7517 | 4.4041 | 6800 | 0.7387 | 0.5494 |
0.7351 | 4.4689 | 6900 | 0.7407 | 0.5359 |
0.7631 | 4.5337 | 7000 | 0.7372 | 0.5406 |
0.734 | 4.5984 | 7100 | 0.7405 | 0.5517 |
0.7188 | 4.6632 | 7200 | 0.7354 | 0.5483 |
0.7522 | 4.7280 | 7300 | 0.7417 | 0.5429 |
0.7437 | 4.7927 | 7400 | 0.7397 | 0.5406 |
0.7688 | 4.8575 | 7500 | 0.7405 | 0.5379 |
0.6892 | 4.9223 | 7600 | 0.7383 | 0.5467 |
0.7512 | 4.9870 | 7700 | 0.7369 | 0.5517 |
### Framework versions
- Transformers 4.43.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
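To reproduce this environment, the pinned versions can be checked at runtime; a small sketch:

```python
# Quick check that the local environment matches the versions listed above.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.43.2",
    "torch": "2.3.0+cu121",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, version in expected.items():
    status = "OK" if installed[name] == version else f"got {installed[name]}"
    print(f"{name}=={version}: {status}")
```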