lapp0 committed
Commit e21936e
Parent: e192967

End of training
README.md CHANGED
@@ -16,14 +16,14 @@ This student model is distilled from the teacher model [gpt2](https://huggingfac
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
- - eval_enwikippl: 99.0
- - eval_frwikippl: 408.0
- - eval_zhwikippl: 149.0
- - eval_tinystoriesppl: 74.5
- - eval_loss: 0.7768
- - eval_runtime: 16.7488
- - eval_samples_per_second: 59.706
- - eval_steps_per_second: 7.463
+ - eval_enwikippl: 84.0
+ - eval_frwikippl: 342.0
+ - eval_zhwikippl: 217.0
+ - eval_tinystoriesppl: 69.5
+ - eval_loss: 0.6877
+ - eval_runtime: 16.9969
+ - eval_samples_per_second: 58.834
+ - eval_steps_per_second: 7.354
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
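The `*ppl` values above are perplexities, presumably measured on English Wikipedia, French Wikipedia, Chinese Wikipedia, and TinyStories samples. Perplexity is the exponential of the mean next-token cross-entropy, so lower is better; a minimal sketch of how such a number is conventionally computed with 🤗 Transformers (illustrative only — the exact eval datasets, windowing, and batching are not shown in this card):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# "gpt2" here is a stand-in; point this at the distilled student checkpoint.
model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def perplexity(text: str) -> float:
    # Perplexity = exp(mean cross-entropy of next-token prediction).
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # Passing labels makes the model return the mean token NLL as .loss.
        out = model(**enc, labels=enc["input_ids"])
    return torch.exp(out.loss).item()

print(perplexity("The quick brown fox jumps over the lazy dog."))
```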
@@ -46,7 +46,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
- - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=0, loss_fn=None, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=0, loss_fn=None, layer_mapper=None, projector=None))
+ - distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=1.0, loss_fn=mse, layer_mapper=last, projector=None), attn_loss_component=LossComponent(label=attn, weight=0, loss_fn=None, layer_mapper=None, projector=None))
 - train_embeddings: True
 - learning_rate: 0.0001
 - train_batch_size: 4
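The substantive change in this commit is the `distillation_objective` line above: the hidden-state loss component is switched on (MSE between the last hidden layers, weight 1.0) alongside the unchanged KL loss on the logits (weight 1). A minimal PyTorch sketch of that combination (an illustration of the configured components, not Distily's actual implementation; `student_out` and `teacher_out` are assumed to be Hugging Face model outputs produced with `output_hidden_states=True`):

```python
import torch.nn.functional as F

def distillation_loss(student_out, teacher_out, kl_weight=1.0, hs_weight=1.0):
    """Combined objective: KL on logits plus MSE on the last hidden state."""
    # KL divergence from teacher to student token distributions,
    # summed over tokens and vocabulary, averaged over the batch.
    s_logp = F.log_softmax(student_out.logits, dim=-1)
    t_prob = F.softmax(teacher_out.logits, dim=-1)
    kl = F.kl_div(s_logp, t_prob, reduction="batchmean")

    # layer_mapper=last: compare only the final hidden states, which works
    # without a projector because student and teacher share a hidden size.
    mse = F.mse_loss(student_out.hidden_states[-1],
                     teacher_out.hidden_states[-1])

    return kl_weight * kl + hs_weight * mse
```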
@@ -58,38 +58,38 @@ The following hyperparameters were used during training:
 - num_epochs: 1.0
 
 ### Resource Usage
- Peak GPU Memory: 7.4226 GB
+ Peak GPU Memory: 7.7252 GB
 
 ### Eval-Phase Metrics
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 43.75 | 61.75 | | | | | 11.8125 | 19.125 |
- | 0 | 0 | 10027008.0 | 5734400.0 | 11.3280 | 16.7303 | 59.772 | 7.471 | 5865472.0 | 3637248.0 |
- | 1000 | 0.0404 | 382.0 | 2016.0 | 1.6270 | 16.7162 | 59.822 | 7.478 | 298.0 | 1200.0 |
- | 2000 | 0.0808 | 270.0 | 988.0 | 1.4340 | 16.7294 | 59.775 | 7.472 | 227.0 | 390.0 |
- | 3000 | 0.1212 | 218.0 | 736.0 | 1.2960 | 16.686 | 59.931 | 7.491 | 185.0 | 239.0 |
- | 4000 | 0.1616 | 182.0 | 708.0 | 1.1629 | 16.7199 | 59.809 | 7.476 | 144.0 | 213.0 |
- | 5000 | 0.2020 | 155.0 | 620.0 | 1.0601 | 16.7539 | 59.688 | 7.461 | 117.0 | 230.0 |
- | 6000 | 0.2424 | 131.0 | 496.0 | 0.9643 | 16.7329 | 59.763 | 7.47 | 104.5 | 193.0 |
- | 7000 | 0.2828 | 122.0 | 512.0 | 0.9001 | 16.7515 | 59.696 | 7.462 | 91.5 | 193.0 |
- | 8000 | 0.3232 | 111.5 | 428.0 | 0.8336 | 16.677 | 59.963 | 7.495 | 82.5 | 161.0 |
- | 9000 | 0.3636 | 99.0 | 408.0 | 0.7768 | 16.7488 | 59.706 | 7.463 | 74.5 | 149.0 |
- | 10000 | 0.4040 | 89.5 | 386.0 | 0.7219 | 16.6219 | 60.162 | 7.52 | 71.5 | 140.0 |
- | 11000 | 0.4444 | 82.5 | 332.0 | 0.6595 | 16.6753 | 59.969 | 7.496 | 67.0 | 148.0 |
- | 12000 | 0.4848 | 78.0 | 300.0 | 0.6302 | 16.6631 | 60.013 | 7.502 | 60.75 | 131.0 |
- | 13000 | 0.5253 | 73.5 | 292.0 | 0.6019 | 16.7166 | 59.821 | 7.478 | 59.5 | 117.0 |
- | 14000 | 0.5657 | 75.0 | 284.0 | 0.5861 | 16.7002 | 59.88 | 7.485 | 59.5 | 137.0 |
- | 15000 | 0.6061 | 71.5 | 252.0 | 0.5722 | 16.6732 | 59.976 | 7.497 | 55.25 | 130.0 |
- | 16000 | 0.6465 | 70.0 | 250.0 | 0.5545 | 16.6934 | 59.904 | 7.488 | 57.75 | 104.5 |
- | 17000 | 0.6869 | 70.0 | 272.0 | 0.5426 | 16.6888 | 59.92 | 7.49 | 55.5 | 130.0 |
- | 18000 | 0.7273 | 70.0 | 248.0 | 0.5380 | 16.6762 | 59.966 | 7.496 | 53.75 | 124.0 |
- | 19000 | 0.7677 | 68.5 | 227.0 | 0.5270 | 16.6682 | 59.994 | 7.499 | 53.5 | 96.5 |
- | 20000 | 0.8081 | 65.5 | 219.0 | 0.5260 | 16.6778 | 59.96 | 7.495 | 52.0 | 129.0 |
- | 21000 | 0.8485 | 68.0 | 228.0 | 0.5154 | 16.7388 | 59.741 | 7.468 | 52.0 | 140.0 |
- | 22000 | 0.8889 | 68.5 | 246.0 | 0.5128 | 16.6637 | 60.011 | 7.501 | 51.75 | 216.0 |
- | 23000 | 0.9293 | 64.5 | 245.0 | 0.5029 | 16.7201 | 59.808 | 7.476 | 52.5 | 146.0 |
- | 24000 | 0.9697 | 66.5 | 230.0 | 0.5067 | 16.7059 | 59.859 | 7.482 | 51.25 | 168.0 |
- | 24750 | 1.0 | 65.5 | 228.0 | 0.5042 | 16.685 | 59.934 | 7.492 | 51.25 | 100.5 |
+ | 0 | 0 | 2473901162496.0 | 170424302305280.0 | 20.7680 | 17.0409 | 58.682 | 7.335 | 4060086272.0 | 71468255805440.0 |
+ | 1000 | 0.0404 | 334.0 | 1464.0 | 1.5419 | 17.0178 | 58.762 | 7.345 | 243.0 | 596.0 |
+ | 2000 | 0.0808 | 232.0 | 756.0 | 1.3235 | 16.9755 | 58.909 | 7.364 | 189.0 | 250.0 |
+ | 3000 | 0.1212 | 180.0 | 628.0 | 1.1620 | 16.9923 | 58.85 | 7.356 | 149.0 | 171.0 |
+ | 4000 | 0.1616 | 150.0 | 576.0 | 1.0434 | 16.9803 | 58.892 | 7.361 | 121.5 | 172.0 |
+ | 5000 | 0.2020 | 130.0 | 504.0 | 0.9520 | 17.0128 | 58.779 | 7.347 | 100.5 | 144.0 |
+ | 6000 | 0.2424 | 113.5 | 420.0 | 0.8702 | 17.0074 | 58.798 | 7.35 | 91.0 | 137.0 |
+ | 7000 | 0.2828 | 106.0 | 408.0 | 0.8100 | 16.9821 | 58.885 | 7.361 | 80.5 | 160.0 |
+ | 8000 | 0.3232 | 96.5 | 396.0 | 0.7421 | 16.9749 | 58.911 | 7.364 | 70.5 | 127.0 |
+ | 9000 | 0.3636 | 84.0 | 342.0 | 0.6877 | 16.9969 | 58.834 | 7.354 | 69.5 | 217.0 |
+ | 10000 | 0.4040 | 78.0 | 300.0 | 0.6467 | 16.9846 | 58.877 | 7.36 | 65.0 | 139.0 |
+ | 11000 | 0.4444 | 77.0 | 278.0 | 0.5957 | 16.9903 | 58.857 | 7.357 | 60.0 | 127.5 |
+ | 12000 | 0.4848 | 75.0 | 272.0 | 0.5789 | 16.9858 | 58.873 | 7.359 | 56.5 | 140.0 |
+ | 13000 | 0.5253 | 71.5 | 266.0 | 0.5525 | 16.9418 | 59.026 | 7.378 | 56.5 | 116.0 |
+ | 14000 | 0.5657 | 71.0 | 252.0 | 0.5416 | 17.088 | 58.521 | 7.315 | 53.75 | 132.0 |
+ | 15000 | 0.6061 | 68.0 | 221.0 | 0.5283 | 16.9524 | 58.989 | 7.374 | 50.25 | 112.5 |
+ | 16000 | 0.6465 | 70.0 | 244.0 | 0.5200 | 17.0495 | 58.653 | 7.332 | 52.5 | 109.5 |
+ | 17000 | 0.6869 | 67.0 | 225.0 | 0.5097 | 17.0223 | 58.747 | 7.343 | 51.5 | 109.0 |
+ | 18000 | 0.7273 | 71.0 | 239.0 | 0.5016 | 17.0519 | 58.644 | 7.331 | 49.5 | 150.0 |
+ | 19000 | 0.7677 | 68.0 | 212.0 | 0.4887 | 17.0831 | 58.537 | 7.317 | 51.25 | 98.0 |
+ | 20000 | 0.8081 | 65.0 | 211.0 | 0.4865 | 17.0098 | 58.789 | 7.349 | 49.0 | 101.5 |
+ | 21000 | 0.8485 | 64.5 | 217.0 | 0.4791 | 17.0253 | 58.736 | 7.342 | 47.5 | 142.0 |
+ | 22000 | 0.8889 | 66.5 | 230.0 | 0.4798 | 16.9954 | 58.839 | 7.355 | 48.5 | 147.0 |
+ | 23000 | 0.9293 | 62.5 | 212.0 | 0.4675 | 16.9835 | 58.881 | 7.36 | 45.5 | 134.0 |
+ | 24000 | 0.9697 | 63.5 | 220.0 | 0.4712 | 16.9973 | 58.833 | 7.354 | 47.0 | 138.0 |
+ | 24750 | 1.0 | 63.75 | 247.0 | 0.4679 | 17.0597 | 58.618 | 7.327 | 45.75 | 205.0 |
 
 ### Framework versions
 - Distily 0.2.0
logs/copy_teacher_modules=_(_lm_head___True)_, hs_layer_mapper=last, hs_loss_fn=mse, hs_weight=1.0, learning_rate=0.0001, per_device_train_batch_size=4/events.out.tfevents.1724156440.02dbb11e2dcc ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e1097da824e2c3d8de78fa43ea9a9d6095ba8a8f0527d4011fd4e466792212c4
+ size 312