WOW ..nice
#1 by mirek190 - opened
This is the third such BIG model for a home PC:
llama 65B, alpaca-lora 65B, and now dromedary 65B.
Have to test it ;)
Thanks for ggml.
I can say it IS MUCH SMARTER.
Tested dromedary-65B-lora-GGML q5_1.
It is able to solve this: 2Y-12=-16
Alpaca-lora 65B cannot do that.
Prompt: solve this equation and explain each step: 2Y-12=-16
The first step is to add 12 to both sides of the equation.
2Y-12+12= -16+12
2Y = -4
Then, we divide both sides by 2 to get Y.
Y = -4/2
Y = -2
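The model's steps can be sanity-checked in a couple of lines of Python (a minimal sketch; `solve_linear` is just an illustrative helper, not anything from the model or llama.cpp):

```python
def solve_linear(a, b, c):
    """Solve a*y + b = c for y, mirroring the model's steps:
    move b to the right-hand side, then divide by a."""
    return (c - b) / a

# 2Y - 12 = -16
y = solve_linear(2, -12, -16)
print(y)  # -2.0
assert 2 * y - 12 == -16  # plug the answer back in to verify
```

So the model's answer Y = -2 checks out.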
Unfortunately it is censored .... alpaca-lora 65B isn't.
Yes, there is a model based on LLaMA 65B that scores 103% of ChatGPT in the GPT-4 eval: https://github.com/ZrrSkywalker/LLaMA-Adapter/tree/main/llama_adapter_v2_chat65b
So you can see the power of this family of 65B models.