AmharicGPT / tokenizer_config.json
{
"clean_up_tokenization_spaces": true,
"model_max_length": 1000000000000000019884624838656,
"special_tokens_map_file": "/root/.cache/huggingface/hub/models--dagim--amharic_tokenizer/snapshots/d4ff599bced963dc0b58ef4754ffb1e3c0f46da2/special_tokens_map.json",
"tokenizer_class": "PreTrainedTokenizerFast"
}
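
For reference, a minimal sketch of how this tokenizer_config.json is consumed when the tokenizer is loaded with the transformers library. The repo ID "dagim/AmharicGPT" below is an assumption inferred from the page path and may need to be adjusted to the actual Hub path; the huge model_max_length value is transformers' VERY_LARGE_INTEGER (int(1e30)), meaning no truncation limit is inherited from this config.

# Assumption: the tokenizer lives at "dagim/AmharicGPT" on the Hugging Face Hub.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("dagim/AmharicGPT")

# Values picked up from tokenizer_config.json:
print(tokenizer.__class__.__name__)    # PreTrainedTokenizerFast
print(tokenizer.model_max_length)      # 1000000000000000019884624838656 (effectively unbounded)

# Simple Amharic round-trip; clean_up_tokenization_spaces mirrors the config setting.
ids = tokenizer("ሰላም ዓለም")["input_ids"]
print(tokenizer.decode(ids, clean_up_tokenization_spaces=True))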