Replit code

#134 opened by joaquinito2073

Hi, these models require custom code to run, which I don't support at the moment (and it likely wouldn't work with llama.cpp anyway). Very sorry.
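For context, here is a minimal sketch of what the "custom code" requirement means in practice: such models ship their own modeling files on the Hub, so transformers will only load them when `trust_remote_code=True` is passed, and llama.cpp has no way to execute that remote Python code, which is why a GGUF conversion is unlikely to work. The model ID below is an illustrative assumption (a Replit code model), not something confirmed in this thread.

```python
# Sketch: loading a model that requires custom (remote) code via transformers.
# The model ID is an assumed example; substitute the repository you actually want.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "replit/replit-code-v1-3b"  # assumed example of a custom-code model

# trust_remote_code=True tells transformers to download and run the modeling
# code bundled with the repository instead of a built-in architecture.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```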

mradermacher changed discussion status to closed
