
Jokestral

This model was created by fine-tuning unsloth/mistral-7b-v0.3-bnb-4bit on the Short Jokes dataset, so its only purpose is generating cringe-worthy jokes.
Write the first few words and the model completes the joke.

Usage

Google Colab example

pip install transformers
pip install --no-deps "trl<0.9.0" peft accelerate bitsandbytes
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the model and tokenizer, and move the model to the GPU
# (the inputs below are sent to "cuda", so the model must be there too).
model = AutoModelForCausalLM.from_pretrained("SantaBot/Jokestral_16bit").to("cuda")
tokenizer = AutoTokenizer.from_pretrained("SantaBot/Jokestral_16bit")

inputs = tokenizer(
    ["My doctor"],  # YOUR PROMPT HERE
    return_tensors="pt",
).to("cuda")

outputs = model.generate(**inputs, max_new_tokens=64, use_cache=True)
tokenizer.batch_decode(outputs)

The output should be something like:
['<s> My doctor told me I have to stop m4sturb4t1ng. I asked him why and he said ""Because I\'m trying to examine you.""\n</s>']
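Note that the decoded string still contains Mistral's `<s>`/`</s>` special tokens. You can pass `skip_special_tokens=True` to `batch_decode` to drop them; as a minimal standalone sketch, the same cleanup can be done with a small helper (the function name here is hypothetical, not part of the model's API):

def strip_special_tokens(text: str) -> str:
    """Remove the BOS/EOS markers and surrounding whitespace from a decoded string."""
    for token in ("<s>", "</s>"):
        text = text.replace(token, "")
    return text.strip()

raw = "<s> My doctor told me I have to stop. </s>"
print(strip_special_tokens(raw))  # -> My doctor told me I have to stop.

This is useful when you want to show just the joke text in a demo UI rather than the raw decoder output.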

