Context Length is only 4k
#2
opened by LINAI
Hi,
We have installed this model on our own servers and noticed that the context length is only 4k. Are there any settings we can apply to get the full 128k context length?
@LINAI You set the context length yourself in whatever inference method you run. You can check the config file: "max_position_embeddings": 131072, which is the same as the original Llama 3.1 Instruct.
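For anyone who wants to verify this on their own deployment, here is a minimal sketch of reading the value with the transformers library (the model path below is a placeholder; point it at your local model directory or Hub repo id):

```python
from transformers import AutoConfig

# Load the model's config.json; replace the path with your own
# local model directory or Hub repo id.
config = AutoConfig.from_pretrained("path/to/model")

# For this model it should print 131072 (128k), matching Llama 3.1 Instruct.
print(config.max_position_embeddings)
```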
Orenguteng changed discussion status to closed
Thank you for your help. It was being restricted by the default value set by TGI.
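For anyone hitting the same limit: TGI caps the sequence length with its launcher settings (e.g. --max-input-tokens and --max-total-tokens) rather than the model config, so the server has to be started with larger values. As a rough sketch, you can check a running server's effective limits through its /info endpoint (the port is an assumption; field names vary slightly across TGI versions):

```python
import json
from urllib.request import urlopen

# Query a running TGI server's /info endpoint (port 8080 is an
# assumption; use whatever your deployment exposes).
with urlopen("http://localhost:8080/info") as resp:
    info = json.load(resp)

# Recent TGI versions report the limits under these keys
# (older versions use "max_input_length" instead).
print(info.get("max_input_tokens"), info.get("max_total_tokens"))
```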