it-5.3-fp16-32k / generation_config.json
{
"bos_token_id": 1,
"do_sample": true,
"eos_token_id": 79097,
"max_new_tokens": 1024,
"num_beams": 3,
"pad_token_id": 2,
"temperature": 0.2,
"top_p": 0.98,
"transformers_version": "4.41.1",
"use_cache": false
}
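
The file above holds the decoding defaults that transformers applies automatically when this checkpoint is loaded. Below is a minimal sketch of using those defaults explicitly; the repo id "apsys/it-5.3-fp16-32k" and the example prompt are assumptions for illustration, not confirmed identifiers. GenerationConfig.from_pretrained(repo_id) would load the same values straight from the uploaded file.

from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

# Assumed repo id; adjust to wherever this checkpoint actually lives.
repo_id = "apsys/it-5.3-fp16-32k"

# GenerationConfig mirroring generation_config.json above.
gen_config = GenerationConfig(
    bos_token_id=1,
    do_sample=True,
    eos_token_id=79097,
    max_new_tokens=1024,
    num_beams=3,
    pad_token_id=2,
    temperature=0.2,
    top_p=0.98,
    use_cache=False,
)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Hello, world", return_tensors="pt")  # placeholder prompt
outputs = model.generate(**inputs, generation_config=gen_config)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Note that do_sample=true combined with num_beams=3 selects beam-search multinomial sampling rather than greedy decoding or plain sampling, and use_cache=false disables the KV cache during generation, trading decoding speed for lower memory use.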