lez-src / tokenizer_config.json
{
  "clean_up_tokenization_spaces": true,
  "eos_token": "<end>",
  "model_max_length": 1000000000000000019884624838656,
  "pad_token": "<pad>",
  "tokenizer_class": "PreTrainedTokenizerFast"
}
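
A minimal sketch of loading a tokenizer that uses this config with the transformers AutoTokenizer API. The repo id "michelleyunun/lez-src" is an assumption inferred from the viewer header, and loading also requires the accompanying tokenizer.json in the same repo, since "tokenizer_class" is "PreTrainedTokenizerFast". The very large "model_max_length" value is the transformers sentinel for "no explicit length cap".

# Hypothetical usage sketch: the repo id below is an assumption inferred
# from the viewer header ("michelleyunun" / "lez-src"), not confirmed by
# the config file itself.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("michelleyunun/lez-src")
print(tokenizer.eos_token)  # "<end>", per "eos_token" in this config
print(tokenizer.pad_token)  # "<pad>", per "pad_token" in this config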