indic-gpt / tokenizer_config.json
Training in progress, step 1
5e68ac3
{
"bos_token": "<|endoftext|>",
"eos_token": "<|endoftext|>",
"model_max_length": 1000000000000000019884624838656,
"tokenizer_class": "PreTrainedTokenizerFast"
}
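A minimal sketch of how this config is consumed, assuming the files live at the root of a Hub repo named "aashay96/indic-gpt" (the repo id is an assumption based on the path above). AutoTokenizer reads tokenizer_config.json to pick the tokenizer class (PreTrainedTokenizerFast here) and the special tokens. The very large model_max_length is the usual sentinel Transformers writes when no maximum length is configured (the nearest float64 value to 1e30), so in practice the tokenizer imposes no length limit of its own.

from transformers import AutoTokenizer

# Assumed repo id; downloading requires network access and the repo to exist.
tokenizer = AutoTokenizer.from_pretrained("aashay96/indic-gpt")

print(tokenizer.bos_token)         # "<|endoftext|>"
print(tokenizer.eos_token)         # "<|endoftext|>"
print(tokenizer.model_max_length)  # huge sentinel value, i.e. effectively unbounded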