Update README.md
# Kanarya-2B: Turkish Language Model
<img src="https://asafaya.me/images/kanarya.webp" alt="Kanarya Logo" style="width:600px;"/>
**Kanarya** is a pre-trained Turkish GPT-J 2B model. Released as part of the [Turkish Data Depository](https://tdd.ai/) efforts, the Kanarya family comes in two sizes: Kanarya-2B (the larger) and Kanarya-0.7B (the smaller). Both models are trained on a large-scale Turkish text corpus filtered from the OSCAR and mC4 datasets. The training data was collected from various sources, including news, articles, and websites, to create a diverse, high-quality dataset. The models are trained using a JAX/Flax implementation of the [GPT-J](https://github.com/kingoflolz/mesh-transformer-jax) architecture.
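Since the checkpoints follow the standard GPT-J architecture, they should load with the Hugging Face `transformers` causal-LM classes. A minimal usage sketch; the repo id `asafaya/kanarya-2b` and the sampling settings are assumptions, not taken from this README:

```python
# Usage sketch for the model card. The repo id below is an assumption
# (not stated in this README); substitute the actual Hugging Face id.
MODEL_ID = "asafaya/kanarya-2b"  # hypothetical repo id

def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Continue a Turkish prompt with the base (pre-trained) model."""
    # Import lazily so the sketch can be read without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,  # sampling suits a base LM better than greedy decoding
        top_p=0.95,
    )
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Türkiye'nin en kalabalık şehri"))
```

Note that this is a base language model, not an instruction-tuned one, so it continues the prompt rather than answering it.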