## Fresh Alpasta, done Al Dente!
It's da *logical* choice! Now with personality emulation quality similar to [GPT4-X-Alpasta-30b!](https://huggingface.co./MetaIX/GPT4-X-Alpasta-30b)
## Model Info:
ChanSung's [Alpaca-LoRA-30B-elina](https://huggingface.co./LLMs/Alpaca-LoRA-30B-elina) merged with [Open Assistant's second finetune](https://huggingface.co./OpenAssistant/oasst-sft-7-llama-30b-xor).
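For context, merging a LoRA adapter into a base model folds the low-rank update into each affected weight matrix: `W' = W + (alpha / r) * B @ A`. A toy numpy sketch of that operation (all sizes and values here are made up for illustration, not taken from this model):

```python
import numpy as np

rng = np.random.default_rng(0)
d, r = 8, 2                        # toy hidden size and LoRA rank
W = rng.standard_normal((d, d))    # base weight matrix
A = rng.standard_normal((r, d))    # LoRA down-projection
B = rng.standard_normal((d, r))    # LoRA up-projection
alpha = 4                          # LoRA scaling factor

# Fold the adapter into the base weight; after this, the
# adapter matrices are no longer needed at inference time.
W_merged = W + (alpha / r) * (B @ A)
```

In practice this is what tools like `peft`'s `merge_and_unload()` do for every adapted layer.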
## Benchmarks:

| Perplexity | Wikitext2 | PTB | C4 |
|---|---|---|---|
| Unquantized | 4.662261962890625 | 24.547462463378906 | 7.05504846572876 |
| [4bit](https://huggingface.co./Aeala/GPT4-x-AlpacaDente2-30b/blob/main/4bit.safetensors) | 5.016242980957031 | 25.576189041137695 | 7.332120418548584 |
~ Thanks to [askmyteapot](https://huggingface.co./askmyteapot) for performing these benchmarks!