updated model_max_length from 1000000000000000019884624838656 to 32768
#1
opened by LHC88
The same bug is in the original tokenizer config by @mistral-ai.
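For context, the huge value 1000000000000000019884624838656 is what `int(1e30)` serializes to; `transformers` uses it as a sentinel when `model_max_length` is unset. A minimal sketch of the fix this PR applies, assuming a local `tokenizer_config.json` (the helper name and threshold check are illustrative, not from the PR):

```python
import json
import os
import tempfile

# Sentinel transformers writes when model_max_length is unset:
# int(1e30) == 1000000000000000019884624838656.
SENTINEL = int(1e30)

def patch_model_max_length(config_path: str, new_max: int = 32768) -> dict:
    """Replace a sentinel model_max_length with the real context length."""
    with open(config_path) as f:
        cfg = json.load(f)
    if cfg.get("model_max_length", 0) >= SENTINEL:
        cfg["model_max_length"] = new_max
        with open(config_path, "w") as f:
            json.dump(cfg, f, indent=2)
    return cfg

# Demo on a throwaway config containing the buggy value.
with tempfile.TemporaryDirectory() as d:
    path = os.path.join(d, "tokenizer_config.json")
    with open(path, "w") as f:
        json.dump({"model_max_length": 1000000000000000019884624838656}, f)
    fixed = patch_model_max_length(path)
    print(fixed["model_max_length"])  # 32768
```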
shimmyshimmer changed pull request status to merged