---
license: other
datasets:
- totally-not-an-llm/EverythingLM-data-V3
- Open-Orca/OpenOrca
- garage-bAInd/Open-Platypus
---

A merge of the EverythingLM-V3-13b QLoRA and OpenOrca-Platypus2-13B.

### Prompt format:
```
USER: <prompt>
ASSISTANT:
```
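A minimal sketch of building a prompt in this template and passing it to the model. The repo id `totally-not-an-llm/PuddleJumper-13b-V2` is taken from the leaderboard details link on this card; the example question and generation parameters are illustrative.

```python
def build_prompt(user_message: str) -> str:
    # Single-turn prompt in the model's USER/ASSISTANT format.
    return f"USER: {user_message}\nASSISTANT:"

prompt = build_prompt("Explain quicksort in one sentence.")
print(prompt)

# Illustrative generation call (downloads the full 13B model):
# from transformers import pipeline
# pipe = pipeline("text-generation", model="totally-not-an-llm/PuddleJumper-13b-V2")
# print(pipe(prompt, max_new_tokens=128)[0]["generated_text"])
```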

### Quants:
- GGUF: https://huggingface.co./TheBloke/PuddleJumper-13B-V2-GGUF
- AWQ: https://huggingface.co./TheBloke/PuddleJumper-13B-V2-AWQ
- GPTQ: https://huggingface.co./TheBloke/PuddleJumper-13B-V2-GPTQ

# [Open LLM Leaderboard Evaluation Results](https://huggingface.co./spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co./datasets/open-llm-leaderboard/details_totally-not-an-llm__PuddleJumper-13b-V2)

| Metric                | Value |
|-----------------------|-------|
| Avg.                  | 49.69 |
| ARC (25-shot)         | 57.0  |
| HellaSwag (10-shot)   | 81.06 |
| MMLU (5-shot)         | 58.3  |
| TruthfulQA (0-shot)   | 52.66 |
| Winogrande (5-shot)   | 72.45 |
| GSM8K (5-shot)        | 3.64  |
| DROP (3-shot)         | 22.74 |