No think tokens visible
#15
by sudkamath - opened
Hey, thanks a lot for the quantized version!
I noticed that I don't observe any think tokens; I only see the final answer. I'm running the llama.cpp Python server. Could you tell me what needs to be done?
Thanks
That's very strange. Unfortunately, I'm not exactly sure, but you could ask in the llama.cpp GitHub issues maybe.
My guess is @sudkamath is viewing the output in a Markdown-rendered viewport... since `<think>` and `</think>` are not valid Markdown, they disappear...
Look at the raw output...
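If it helps, here is a minimal sketch for checking the raw completion. It assumes a llama-cpp-python server running locally with its OpenAI-compatible endpoint on the default port 8000; the URL, model name, and prompt are placeholders, so adjust them for your setup:

```python
import requests

# Hit the OpenAI-compatible chat endpoint of the local llama-cpp-python
# server. "local-model" is a placeholder; the server may ignore it.
resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "local-model",
        "messages": [{"role": "user", "content": "What is 17 * 24?"}],
    },
)
content = resp.json()["choices"][0]["message"]["content"]

# Print the repr of the raw string: in a plain terminal any
# <think>...</think> tags are visible, whereas a Markdown/HTML
# renderer would swallow them as unknown tags.
print(repr(content))
```

If `<think>...</think>` shows up in the printed string, the model is emitting think tokens and it's purely a rendering issue on the viewer side.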