---
library_name: transformers
tags:
- llama-factory
---

# Model Card for Horny Stheno

**This was an experiment to see whether other models can be aligned via LoRA. Yes, they can. We aligned this one to be always horny.**

We took the Stheno v3.3 weights from [here](https://huggingface.co./Sao10K/L3-8B-Stheno-v3.3-32K) and applied our [LoRA](https://huggingface.co./nothingiisreal/llama3-8B-DWP-lora) at alpha = 768.
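For reference, below is a minimal sketch of this kind of merge using `transformers` and `peft`. It is not the exact script used for this release: the repository IDs come from the links above, the alpha override assumes the adapter config exposes `lora_alpha`, and the output directory name is made up.

```python
# Minimal sketch (assumptions noted in comments): load the Stheno v3.3 base
# weights, apply the DWP LoRA with lora_alpha set to 768, merge the adapter
# into the base weights, and save the result.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftConfig, PeftModel

base_id = "Sao10K/L3-8B-Stheno-v3.3-32K"
lora_id = "nothingiisreal/llama3-8B-DWP-lora"

base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)
tokenizer = AutoTokenizer.from_pretrained(base_id)

# Override the adapter's scaling (effective scale is lora_alpha / r).
adapter_cfg = PeftConfig.from_pretrained(lora_id)
adapter_cfg.lora_alpha = 768

model = PeftModel.from_pretrained(base, lora_id, config=adapter_cfg)
merged = model.merge_and_unload()  # fold the LoRA deltas into the base weights

merged.save_pretrained("horny-stheno-merged")      # hypothetical output path
tokenizer.save_pretrained("horny-stheno-merged")
```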

Thank you to Sao10K for the amazing model.

This is not legal advice. I don't apply any extra licensing to my own LoRA.

The LLaMA 3 license ([text here](https://llama.meta.com/llama3/license/)) may conflict with Creative Commons Attribution-NonCommercial 4.0.

If you want to host a model using our LoRA, you have our permission, but consider asking Sao10K for permission if you want to host their model.

Again, not legal advice.