ERROR: Failed to load the model. TypeError: not a string
#3 opened by AIJUUD
If you're using oobabooga's text-generation-webui, check this note:
https://huggingface.co./beomi/llama-2-ko-7b#note-for-oobaboogatext-generation-webui
Llama-2-Ko uses the fast tokenizer, while the webui loads the slow tokenizer by default; that mismatch is what causes the error.
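
A minimal sketch of the same issue outside the webui, assuming the `transformers` library and the `beomi/llama-2-ko-7b` repo from the link above: loading the tokenizer with `use_fast=True` works, while forcing the slow (SentencePiece) tokenizer reproduces the "TypeError: not a string" failure, since the repo does not provide the slow tokenizer's `tokenizer.model` file.

```python
from transformers import AutoTokenizer

MODEL_ID = "beomi/llama-2-ko-7b"  # repo from the link above

# Works: explicitly request the fast tokenizer (tokenizer.json).
tok = AutoTokenizer.from_pretrained(MODEL_ID, use_fast=True)
print(tok.tokenize("안녕하세요"))

# Reproduces the error: the slow tokenizer path expects a SentencePiece
# tokenizer.model file, which this repo does not ship.
# AutoTokenizer.from_pretrained(MODEL_ID, use_fast=False)
# -> TypeError: not a string
```

In the webui the equivalent fix is to make it use the fast tokenizer, as described in the model card note linked above.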
beomi changed discussion status to closed