
Model Summary

recipe-gantt is a monotask language model that translates recipes into gantt charts in TSV format.

It is a QLoRA finetune of Mistral-7B-v0.1 on the pocasrocas/recipe-gantt dataset. I then used llama.cpp to convert it to .gguf format for fast local inference.
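If you want to run the GGUF weights locally yourself, a minimal sketch with the llama-cpp-python bindings might look like the following. The filename and sampling settings below are assumptions for illustration, not taken from the repo.

```python
# Minimal local inference sketch using llama-cpp-python (assumed setup).
from llama_cpp import Llama

llm = Llama(
    model_path="recipe-gantt-v0.1.Q4_K_M.gguf",  # hypothetical local filename
    n_ctx=4096,  # room for a full recipe plus the generated TSV
)

prompt = "..."  # an Alpaca-style prompt, see "Input format" below
result = llm(prompt, max_tokens=2048, temperature=0.0)
print(result["choices"][0]["text"])  # the generated TSV gantt chart
```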

Intended use

To create recipe gantt charts and recipe gantt charts only 🧑‍🍳

It's released here to accompany the recipe-gantt tool.

Input format

The model was trained with the Alpaca instruction/input/response prompt format. Exact details of the expected input can be inferred from the inference code here.
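For illustration, a sketch of an Alpaca-style prompt is below. The exact instruction wording used during training is an assumption here, so defer to the linked inference code for the authoritative template.

```python
# Sketch of an Alpaca instruction/input/response prompt (wording assumed).
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task, paired with an input that "
    "provides further context. Write a response that appropriately completes "
    "the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n"
)

recipe_text = "..."  # the raw recipe (ingredients and method)
prompt = ALPACA_TEMPLATE.format(
    instruction="Convert this recipe into a gantt chart in TSV format.",  # assumed wording
    input=recipe_text,
)
```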

Limitations

  • the model will inherit any limitations of the parent model (Mistral-7B-v0.1)
  • it was finetuned on a tiny synthetic dataset of only 288 examples, so it sometimes produces corrupted TSV files or populates cells inaccurately.

Training

  • QLoRA finetune using axolotl
  • ~1hr on NVIDIA GeForce RTX 3090 Ti (wandb)
  • Training code here
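
For readers unfamiliar with QLoRA, the sketch below shows an equivalent setup in plain transformers/peft. The actual run was driven by an axolotl config (see the training code above), and every hyperparameter here is an illustrative assumption.

```python
# Illustrative QLoRA setup; the real finetune used axolotl, not this script.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                      # the "Q" in QLoRA: 4-bit base weights
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",
    quantization_config=bnb_config,
    device_map="auto",
)

lora_config = LoraConfig(
    r=16,                                   # assumed rank
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)  # only the LoRA adapters are trained
```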