Text Generation
GGUF
English
MOE
Mixture of Experts
2X8B
deepseek
reasoning
thinking
creative
creative writing
128k context
general usage
problem solving
brainstorming
solve riddles
fiction writing
plot generation
sub-plot generation
story generation
scene continue
storytelling
fiction story
story
writing
fiction
roleplaying
llama 3.1
mergekit
Inference Endpoints
conversational
Update README.md
README.md
CHANGED
@@ -33,8 +33,6 @@ tags:
 pipeline_tag: text-generation
 ---
 
-(quants uploading, 3 examples below.)
-
 <H2>L3.1-MOE-2X8B-Deepseek-DeepHermes-e32-13.7B-gguf</H2>
 
 <img src="two-gods2.jpg" style="float:right; width:300px; height:300px; padding:5px;">