ve-forbryderne committed cb7ae02 (parent: 24a7465)
Remove incorrect link from README.md
README.md
CHANGED
@@ -9,7 +9,7 @@ inference: false
 This is the second generation of the original Shinen made by Mr. Seeker. The full dataset consists of 6 different sources, all surrounding the "Adult" theme. The name "Erebus" comes from Greek mythology, where it is the personification of "darkness". This is in line with Shin'en, or "deep abyss". For inquiries, please contact the KoboldAI community. **Warning: THIS model is NOT suitable for use by minors. The model will output X-rated content.**

 ## Training procedure

-GPT-NeoX-20B-Erebus was trained on a TPUv3-256 TPU pod using a heavily modified version of Ben Wang's Mesh Transformer JAX library, the original version of which was used by EleutherAI to train their GPT-J-6B model.
+GPT-NeoX-20B-Erebus was trained on a TPUv3-256 TPU pod using a heavily modified version of Ben Wang's Mesh Transformer JAX library, the original version of which was used by EleutherAI to train their GPT-J-6B model.

 ## Training data

 The data can be divided into 6 different datasets: