DeBERTa-ST-AllLayers-v3.1 / sentence_bert_config.json
Commit a232ba1 (verified): KL-divergence loss with per-layer self-distillation; multi-step, multi-task training.
{
"max_seq_length": 512,
"do_lower_case": false
}
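
For context, sentence_bert_config.json is the small config that sentence-transformers reads when the model is loaded: max_seq_length caps (and truncates) the tokenized input length, and do_lower_case controls whether input text is lower-cased before tokenization. A minimal sketch of how these fields surface at load time, assuming the repo id is bobox/DeBERTa-ST-AllLayers-v3.1 (inferred from the path above):

from sentence_transformers import SentenceTransformer

# Assumed repo id, inferred from the file path at the top of this page.
model = SentenceTransformer("bobox/DeBERTa-ST-AllLayers-v3.1")

# max_seq_length comes from sentence_bert_config.json; longer inputs are truncated.
print(model.max_seq_length)  # 512

# do_lower_case = false means the tokenizer keeps the original casing of the input.
embeddings = model.encode(["An Example Sentence Kept In Its Original Case."])
print(embeddings.shape)  # (1, embedding_dim)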