I've been trying this script with some models; with llama3.2:3b it didn't work, it just responded the same.