jacobfulano committed
Commit 2435297 • 1 Parent(s): 1bcf6fd
Update README.md

README.md CHANGED
@@ -64,7 +64,7 @@ Apache-2.0 (commercial use permitted)
 
 * [Blog post: Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs](https://www.mosaicml.com/blog/mpt-7b)
 * [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
-* Questions: Feel free to contact us via the [MosaicML Community Slack](https://join.slack.com/t/mosaicml-community/shared_invite/zt-
+* Questions: Feel free to contact us via the [MosaicML Community Slack](https://join.slack.com/t/mosaicml-community/shared_invite/zt-1btms90mc-GipE2ufuPkKY0QBrmF3LSA)!
 
 
 ## How to Use
@@ -178,6 +178,10 @@ MPT-7B was trained on various public datasets.
 While great efforts have been taken to clean the pretraining data, it is possible that this model could generate lewd, biased or otherwise offensive outputs.
 
 
+## MosaicML Platform
+
+If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://forms.mosaicml.com/demo).
+
 ## Citation
 
 Please cite this model using the following format: