Yi-Ko-6B-Instruct-v1.0 / special_tokens_map.json
Upload tokenizer · commit 8ba612d · wkshin89 · 573 Bytes
{
  "bos_token": {
    "content": "<|startoftext|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "eos_token": {
    "content": "<|endoftext|>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "pad_token": {
    "content": "<unk>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  },
  "unk_token": {
    "content": "<unk>",
    "lstrip": false,
    "normalized": false,
    "rstrip": false,
    "single_word": false
  }
}
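The map above can be inspected with Python's standard `json` module; a minimal sketch (the embedded string reproduces the file contents above, so the snippet is self-contained and does not fetch the repo):

```python
import json

# special_tokens_map.json contents, embedded verbatim for a self-contained demo.
SPECIAL_TOKENS_MAP = """{
  "bos_token": {"content": "<|startoftext|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
  "eos_token": {"content": "<|endoftext|>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
  "pad_token": {"content": "<unk>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false},
  "unk_token": {"content": "<unk>", "lstrip": false, "normalized": false, "rstrip": false, "single_word": false}
}"""

tokens = json.loads(SPECIAL_TOKENS_MAP)

# Note that this config reuses the unknown token as the padding token,
# a common choice for Llama-style tokenizers that ship without a dedicated pad token.
print(tokens["bos_token"]["content"])  # <|startoftext|>
print(tokens["pad_token"]["content"] == tokens["unk_token"]["content"])  # True
```

In practice you rarely parse this file by hand: loading the repo with `transformers.AutoTokenizer.from_pretrained(...)` reads `special_tokens_map.json` automatically and exposes the values as `tokenizer.bos_token`, `tokenizer.eos_token`, `tokenizer.pad_token`, and `tokenizer.unk_token`.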