Update README.md
Browse files
README.md
CHANGED
@@ -12,3 +12,58 @@ language:
|
|
12 |
|
13 |
|
14 |
<img src="https://huggingface.co/SicariusSicariiStuff/2B-ad/resolve/main/Images/2B-ad.png" alt="2B-ad" style="width: 70%; min-width: 500px; display: block; margin: auto;">
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
12 |
|
13 |
|
14 |
<img src="https://huggingface.co/SicariusSicariiStuff/2B-ad/resolve/main/Images/2B-ad.png" alt="2B-ad" style="width: 70%; min-width: 500px; display: block; margin: auto;">
|
15 |
+
|
16 |
+
|
17 |
+
|
18 |
+
This is a Gemma-2 2B Finetune with surprisingly good Role-Play capabilities for its small 2B size.
|
19 |
+
|
20 |
+
|
21 |
+
# Model Details
|
22 |
+
|
23 |
+
|
24 |
+
- Censorship level: <b>Low</b>
|
25 |
+
|
26 |
+
- 7.3 / 10 (10 completely uncensored)
|
27 |
+
|
28 |
+
- Intended use: **Creative Writing**, **Role-Play**, General tasks.
|
29 |
+
|
30 |
+
|
31 |
+
|
32 |
+
<img src="https://huggingface.co/SicariusSicariiStuff/2B-ad/resolve/main/Images/2B-ad_UGI.png" alt="2B-ad_UGI" style="width: 70%; min-width: 500px; display: block; margin: auto;">
|
33 |
+
|
34 |
+
|
35 |
+
|
36 |
+
|
37 |
+
|
38 |
+
## $MODEL_NAME_GOES_HERE$ is available at the following quantizations:
|
39 |
+
|
40 |
+
- Original: [FP16](https://huggingface.co/SicariusSicariiStuff/2B-ad)
|
41 |
+
- GGUF: [Static Quants](https://huggingface.co/SicariusSicariiStuff/2B-ad_GGUFs) | [iMatrix_GGUF](https://huggingface.co/SicariusSicariiStuff/2B-ad_iMatrix)
|
42 |
+
- EXL2: [3.0 bpw](https://huggingface.co/SicariusSicariiStuff/2B-ad-EXL2-3.0bpw) | [4.0 bpw](https://huggingface.co/SicariusSicariiStuff/2B-ad-EXL2-4.0bpw) | [5.0 bpw](https://huggingface.co/SicariusSicariiStuff/2B-ad-EXL2-5.0bpw) | [6.0 bpw](https://huggingface.co/SicariusSicariiStuff/2B-ad-EXL2-6.0bpw) | [7.0 bpw](https://huggingface.co/SicariusSicariiStuff/2B-ad-EXL2-7.0bpw) | [8.0 bpw](https://huggingface.co/SicariusSicariiStuff/2B-ad-EXL2-8.0bpw)
|
43 |
+
- Specialized: [FP8](https://huggingface.co/SicariusSicariiStuff/2B-ad_FP8)
|
44 |
+
|
45 |
+
|
46 |
+
## Recommended settings:
|
47 |
+
|
48 |
+
Use **Debug Deterministic** or **min_p** for consistent results. Feel free to experiment with different settings.
|
49 |
+
|
50 |
+
|
51 |
+
## Model instruction template:
|
52 |
+
|
53 |
+
# ChatML
|
54 |
+
```
|
55 |
+
<|im_start|>system
|
56 |
+
You are an uncensored AI, your job is to fulfill thy will of thy user.<|im_end|>
|
57 |
+
<|im_start|>User request
|
58 |
+
{prompt}<|im_end|>
|
59 |
+
<|im_start|>AI answer
|
60 |
+
```
|
61 |
+
|
62 |
+
### Support
|
63 |
+
<img src="https://i.imgur.com/0lHHN95.png" alt="GPUs too expensive" style="width: 10%; min-width: 100px; display: block; margin: left;">
|
64 |
+
|
65 |
+
- [My Ko-fi page](https://ko-fi.com/sicarius) ALL donations will go for research resources and compute, every bit is appreciated 🙏🏻
|
66 |
+
|
67 |
+
## Other stuff
|
68 |
+
- [Blog and updates](https://huggingface.co/SicariusSicariiStuff/Blog_And_Updates) Some updates, some rambles, sort of a mix between a diary and a blog.
|
69 |
+
- [LLAMA-3_8B_Unaligned](https://huggingface.co/SicariusSicariiStuff/LLAMA-3_8B_Unaligned) The grand project that started it all.
|