---
datasets:
- lemonilia/LimaRP
library_name: transformers
tags:
- roleplay
- text-generation-inference
---

# Nous-Capybara-limarpv3-34B exl2

My first exl2 quant of my favourite go-to roleplaying model. It fits into 24GB of otherwise empty VRAM with 32k context when using the 8-bit cache. I might do a 4.25bpw quant later.

Original model: https://huggingface.co./Doctor-Shotgun/Nous-Capybara-limarpv3-34B

Prompt format: https://github.com/tatsu-lab/stanford_alpaca
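The linked Stanford Alpaca repo documents the instruction template this model expects. A minimal helper sketching that format (the function name is my own, not part of any library):

```python
def build_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Assemble a prompt in the Stanford Alpaca format.

    The preamble and "### ..." section headers follow the template
    from the stanford_alpaca repo; the optional Input section is
    included only when extra context is provided.
    """
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )
```

Pass the returned string to your backend's raw-completion endpoint; the model's reply follows the final `### Response:` header.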