GPT2-large-GOTfinetuned_v3 / special_tokens_map.json
{
  "bos_token": "[BOS]",
  "eos_token": "[EOS]",
  "unk_token": {
    "content": "<|endoftext|>",
    "lstrip": false,
    "normalized": true,
    "rstrip": false,
    "single_word": false
  }
}
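
This file maps the tokenizer's special-token slots: BOS and EOS are custom "[BOS]"/"[EOS]" tokens, while the unknown token reuses GPT-2's "<|endoftext|>". A minimal sketch of loading the tokenizer and checking that these values are picked up; the repo id "huangtuoyue/GPT2-large-GOTfinetuned_v3" is assumed from the page title.

```python
from transformers import AutoTokenizer

# Repo id assumed from the page title; adjust if the actual id differs.
tokenizer = AutoTokenizer.from_pretrained("huangtuoyue/GPT2-large-GOTfinetuned_v3")

# These attributes are populated from special_tokens_map.json.
print(tokenizer.bos_token)  # expected: "[BOS]"
print(tokenizer.eos_token)  # expected: "[EOS]"
print(tokenizer.unk_token)  # expected: "<|endoftext|>"
```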