# t5-small-awesome-text-to-sql-2024-11-10_13-40
This model is a fine-tuned version of cssupport/t5-small-awesome-text-to-sql on the arrow dataset. It achieves the following results on the evaluation set:
- Loss: 0.1505
- Gen Len: 19.0
- Bertscorer-p: 0.5983
- Bertscorer-r: 0.1002
- Bertscorer-f1: 0.3375
- Sacrebleu-score: 6.1735
- Sacrebleu-precisions: [92.82196987876635, 86.09309987961223, 81.16865589315682, 77.5936294965929]
- Bleu-bp: 0.0733
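As a sanity check on the metrics above: the SacreBLEU score is the brevity penalty multiplied by the geometric mean of the four n-gram precisions. A quick pure-Python computation reproduces the reported value (up to rounding of the brevity penalty):

```python
import math

# Values reported on the evaluation set above.
precisions = [92.82196987876635, 86.09309987961223,
              81.16865589315682, 77.5936294965929]
brevity_penalty = 0.0733

# BLEU = BP * (p1 * p2 * p3 * p4) ** (1/4)
geo_mean = math.exp(sum(math.log(p) for p in precisions) / len(precisions))
bleu = brevity_penalty * geo_mean
print(round(bleu, 4))  # close to the reported 6.1735
```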
## Model description
More information needed
## Intended uses & limitations
More information needed
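Usage is not documented in this card. A minimal inference sketch with the `transformers` library might look like the following; note that the prompt template (`tables:` / `query for:`) is an assumption modeled on common text-to-SQL fine-tunes, not something this card specifies:

```python
# Hedged usage sketch; the prompt format below is an assumption,
# not documented by this model card.

def build_prompt(schema: str, question: str) -> str:
    """Combine a CREATE TABLE schema and a natural-language question
    into a single input string for the seq2seq model."""
    return f"tables:\n{schema}\nquery for: {question}"

def generate_sql(
    schema: str,
    question: str,
    model_id: str = "PopularPenguin/t5-small-awesome-text-to-sql-2024-11-10_13-40",
) -> str:
    """Load the fine-tuned checkpoint and generate a SQL query.
    Requires `transformers` and `torch`; downloads the model on first call."""
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
    inputs = tokenizer(build_prompt(schema, question), return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Example call (commented out to avoid the model download):
# sql = generate_sql("CREATE TABLE users (id INT, name TEXT)",
#                    "How many users are there?")
```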
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
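The hyperparameters above map onto Hugging Face `Seq2SeqTrainingArguments` roughly as follows. This is a sketch, not the actual training script (which is not included with this card): `output_dir` is assumed, and the Adam betas/epsilon listed above are the library defaults, so they need no explicit arguments.

```python
from transformers import Seq2SeqTrainingArguments

# Best-guess reconstruction of the reported configuration.
args = Seq2SeqTrainingArguments(
    output_dir="t5-small-awesome-text-to-sql-2024-11-10_13-40",  # assumed
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,  # "Native AMP" mixed-precision training
    # Adam with betas=(0.9, 0.999), epsilon=1e-08 is the default optimizer.
)
```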
### Training results
| Training Loss | Epoch | Step | Validation Loss | Gen Len | Bertscorer-p | Bertscorer-r | Bertscorer-f1 | Sacrebleu-score | Sacrebleu-precisions | Bleu-bp |
|---|---|---|---|---|---|---|---|---|---|---|
| 0.2655 | 1.0 | 4772 | 0.2099 | 19.0 | 0.5770 | 0.0864 | 0.3203 | 5.7173 | [91.0934769807022, 81.88030009989161, 75.59001146341751, 71.32247244849066] | 0.0718 |
| 0.1951 | 2.0 | 9544 | 0.1772 | 19.0 | 0.5695 | 0.0718 | 0.3090 | 5.7315 | [91.38097911302968, 82.52214039836731, 76.55664627495614, 73.06145893164847] | 0.0711 |
| 0.1609 | 3.0 | 14316 | 0.1628 | 19.0 | 0.5960 | 0.1033 | 0.3382 | 6.0737 | [92.32304047118862, 84.75338215740487, 79.32502315982035, 75.25860249102807] | 0.0735 |
| 0.1412 | 4.0 | 19088 | 0.1551 | 19.0 | 0.5925 | 0.0959 | 0.3326 | 6.0701 | [92.56176903043524, 85.09918369073299, 79.79597353297214, 76.12497023888257] | 0.0730 |
| 0.1191 | 5.0 | 23860 | 0.1512 | 19.0 | 0.5905 | 0.0928 | 0.3300 | 6.0937 | [92.29263048778147, 84.9906547977318, 79.83711978971085, 76.22241882452364] | 0.0733 |
| 0.1063 | 6.0 | 28632 | 0.1486 | 19.0 | 0.5959 | 0.0986 | 0.3356 | 6.1128 | [92.67271190348113, 85.5578689269597, 80.37916696032137, 76.71086200742904] | 0.0731 |
| 0.094 | 7.0 | 33404 | 0.1489 | 19.0 | 0.5984 | 0.1024 | 0.3388 | 6.1770 | [92.60841659561831, 85.6159908960634, 80.52775143703391, 76.7429609924408] | 0.0738 |
| 0.0875 | 8.0 | 38176 | 0.1496 | 19.0 | 0.5960 | 0.0976 | 0.3351 | 6.1421 | [92.6290822842547, 85.75971432797346, 80.81931219105543, 77.24221764177369] | 0.0732 |
| 0.0841 | 9.0 | 42948 | 0.1498 | 19.0 | 0.6019 | 0.1059 | 0.3424 | 6.2261 | [92.84100049795074, 86.14431816984929, 81.20480235905357, 77.4564647967041] | 0.0739 |
| 0.0777 | 10.0 | 47720 | 0.1505 | 19.0 | 0.5983 | 0.1002 | 0.3375 | 6.1735 | [92.82196987876635, 86.09309987961223, 81.16865589315682, 77.5936294965929] | 0.0733 |
### Framework versions
- Transformers 4.45.1
- Pytorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0