Context length

#3
by MoaazZakyBq - opened

What is the maximum context length for this model?

OpenOrca org

The maximum context length is 2048 tokens.
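For anyone who wants to check this themselves: for LLaMA-family models on the Hub, the context window is recorded in the model's `config.json` under the `max_position_embeddings` field. A minimal sketch reading that field from a local copy of the config (the inline JSON here is a hypothetical stand-in for the real file):

```python
import json

# Hypothetical excerpt of the model's config.json; for LLaMA-family models
# the context window is stored in "max_position_embeddings".
config_text = '{"model_type": "llama", "max_position_embeddings": 2048}'

config = json.loads(config_text)
context_length = config["max_position_embeddings"]
print(context_length)  # 2048, matching the reply above
```

With `transformers` installed, the same value can be read directly via `AutoConfig.from_pretrained(model_id).max_position_embeddings` without downloading the weights.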