Update max_position_embeddings to 128k context size instead of 32k

#11

From 32768 to 131072. @patrickvonplaten, could you take a look just to be sure? Thanks!
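As a sanity check on the numbers in this PR, a minimal sketch of the config change (the `config` dict here is illustrative, standing in for the model's `config.json`; only the `max_position_embeddings` key is taken from this PR):

```python
# Illustrative sketch of the change: max_position_embeddings goes
# from 32768 (32k = 32 * 1024) to 131072 (128k = 128 * 1024).
config = {"max_position_embeddings": 32768}   # old value, 32k context
config["max_position_embeddings"] = 131072    # new value, 128k context

# Confirm the new value really is 128k tokens.
assert config["max_position_embeddings"] == 128 * 1024
print(config["max_position_embeddings"])  # 131072
```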

timlacroix changed pull request status to merged