louisbrulenaudet
committed on
Update README.md
README.md CHANGED
@@ -23,9 +23,9 @@ library_name: transformers
 
 <center><img src='https://i.imgur.com/0xFTuAX.png' width='450px'></center>
 
-# Pearl-3x7B, an xtraordinary
+# Pearl-3x7B, an xtraordinary Mixture of Experts (MoE) for data science
 
-Pearl-3x7B is a
+Pearl-3x7B is a Mixture of Experts (MoE) made with the following models :
 * [dvilasuero/DistilabelBeagle14-7B](https://huggingface.co/dvilasuero/DistilabelBeagle14-7B)
 * [beowolx/CodeNinja-1.0-OpenChat-7B](https://huggingface.co/beowolx/CodeNinja-1.0-OpenChat-7B)
 * [WizardLM/WizardMath-7B-V1.1](https://huggingface.co/WizardLM/WizardMath-7B-V1.1)
@@ -134,7 +134,7 @@ If you use this code in your research, please use the following BibTeX entry.
 ```BibTeX
 @misc{louisbrulenaudet2023,
   author = {Louis Brulé Naudet},
-  title = {Pearl-3x7B, an xtraordinary
+  title = {Pearl-3x7B, an xtraordinary Mixture of Experts (MoE) for data science},
   year = {2023}
   howpublished = {\url{https://huggingface.co/louisbrulenaudet/Pearl-3x7B}},
 }
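
The context line of the first hunk shows `library_name: transformers`, so the merged MoE described above is intended to be loaded through the standard transformers API. The sketch below is a minimal, hypothetical usage example rather than code from the model card: the precision, device placement, and prompt are assumptions chosen for illustration.

```python
# Minimal sketch (not from the model card): loading Pearl-3x7B with transformers.
# The repo id comes from the diff above; dtype, device placement and generation
# settings are assumptions for illustration only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "louisbrulenaudet/Pearl-3x7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: half precision to keep the 3x7B MoE in memory
    device_map="auto",          # assumption: requires `accelerate` to place experts across devices
)

prompt = "Write a pandas one-liner that counts missing values per column."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The `device_map="auto"` setting assumes the `accelerate` package is installed so the expert weights can be sharded across the available devices; on a single large GPU it can be dropped in favor of a plain `.to("cuda")`.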