13b version
#2, opened by rombodawg
Do you have plans for making a 13b version with your dataset so it can run on more modest hardware?
and even a 7b version tbh
the higher param models may also be more accessible after quantization
True, but even in 4-bit the 34B models can't be fully loaded on a 10GB GPU like the 3080, which is what I have, along with a lot of other people. Most people don't own a 3090.
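The arithmetic behind that claim can be sketched quickly. This is a rough back-of-the-envelope estimate (the function name and the flat 1 GB overhead figure are assumptions, not measured values; real usage also depends on context length and KV cache):

```python
def est_vram_gb(n_params_billions: float, bits: int, overhead_gb: float = 1.0) -> float:
    """Rough VRAM estimate in GB: quantized weights only, plus a flat
    allowance for activations/KV cache. Real usage varies with context."""
    return n_params_billions * bits / 8 + overhead_gb

for size in (34, 13, 7):
    print(f"{size}B @ 4-bit: ~{est_vram_gb(size, 4):.1f} GB")
# 34B at 4-bit needs roughly 17 GB for weights alone, well over 10 GB,
# while 13B (~6.5 GB) and 7B (~3.5 GB) fit comfortably on a 3080.
```

Even before overhead, 34e9 parameters at 4 bits each is about 17 GB of weights, which is why a 10GB card can't hold the full model.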
I agree, I only have a 2070. Ideally Phind would host the model on Hugging Face.
Yes, please make 13b and 7b models.