Low Context Length
#8
by avneetsingh - opened
We are moving from a 128K to a 4K context length for a bigger-footprint model? Should I assume this model is meant for RAG setups where less context is needed? I will update this thread, as the low context length is the biggest issue for me.
Model card -> Summary -> 3rd paragraph
@avneetsingh We're working on it. Thanks!
hunkim changed discussion status to closed