
A merge of the EverythingLM-V3-13b QLoRA and OpenOrca-Platypus2-13B models.

Prompt format:

USER: <prompt>
ASSISTANT:
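
The prompt format above can be sketched as a small helper. This is a minimal, hypothetical builder (the function name and single-turn shape are assumptions; the card only documents the `USER:`/`ASSISTANT:` template):

```python
def build_prompt(user_message: str) -> str:
    """Format a single-turn prompt in the model's USER/ASSISTANT template.

    Note: this helper is illustrative; the model card only specifies the
    template itself, not a multi-turn convention.
    """
    return f"USER: {user_message}\nASSISTANT:"

# Example:
# build_prompt("What is QLoRA?")
# → "USER: What is QLoRA?\nASSISTANT:"
```

The generated text after `ASSISTANT:` is the model's reply; pass the full string to whatever inference backend you use (e.g. one of the quantized builds linked below).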

Quants:

https://huggingface.co./TheBloke/PuddleJumper-13B-V2-GGUF

https://huggingface.co./TheBloke/PuddleJumper-13B-V2-AWQ

https://huggingface.co./TheBloke/PuddleJumper-13B-V2-GPTQ

Open LLM Leaderboard Evaluation Results

Detailed results can be found here

| Metric               | Value |
|----------------------|------:|
| Avg.                 | 49.69 |
| ARC (25-shot)        | 57.0  |
| HellaSwag (10-shot)  | 81.06 |
| MMLU (5-shot)        | 58.3  |
| TruthfulQA (0-shot)  | 52.66 |
| Winogrande (5-shot)  | 72.45 |
| GSM8K (5-shot)       | 3.64  |
| DROP (3-shot)        | 22.74 |