update: model description
README.md CHANGED
@@ -13,6 +13,8 @@ library_name: transformers
 ## Model Description
 PLaMo 2 1B is a 1B model pre-trained on English and Japanese datasets, developed by Preferred Elements, Inc.
 
+PLaMo 2 models adopt the [Samba](https://arxiv.org/abs/2406.07522) architecture rather than the Transformer architecture. Samba integrates [Mamba](https://arxiv.org/abs/2312.00752), a selective State Space Model (SSM), with sliding window attention, combining their strengths for improved efficiency and performance.
+
 PLaMo 2 1B is released under Apache License version 2.0.
 
 **NOTE**: This model has **NOT** been instruction-tuned for chat dialog or other downstream tasks.
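
The added paragraph describes Samba's hybrid layout: selective-SSM (Mamba) blocks interleaved with sliding-window attention. Below is a minimal PyTorch sketch of that interleaving pattern only; the block internals here are placeholders, not PLaMo 2's actual implementation, and the layer ordering and window size are assumptions for illustration.

```python
# Sketch of a Samba-style hybrid stack: alternating Mamba (SSM) blocks and
# sliding-window attention blocks. Block internals are placeholders.
import torch
import torch.nn as nn

class MambaBlock(nn.Module):
    """Stand-in for a selective SSM (Mamba) block, O(n) in sequence length."""
    def __init__(self, dim: int):
        super().__init__()
        self.mixer = nn.Linear(dim, dim)  # placeholder for the real SSM scan
    def forward(self, x):
        return x + self.mixer(x)

class SlidingWindowAttention(nn.Module):
    """Causal self-attention restricted to a local window of past tokens."""
    def __init__(self, dim: int, num_heads: int = 4, window: int = 128):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.window = window
    def forward(self, x):
        n = x.size(1)
        i = torch.arange(n)
        # True = masked: future tokens, and tokens older than `window` steps
        mask = (i[None, :] > i[:, None]) | (i[:, None] - i[None, :] >= self.window)
        out, _ = self.attn(x, x, x, attn_mask=mask)
        return x + out

class SambaStyleStack(nn.Module):
    """Alternate SSM and local-attention blocks -- the hybrid pattern."""
    def __init__(self, dim: int, depth: int):
        super().__init__()
        self.layers = nn.ModuleList(
            MambaBlock(dim) if i % 2 == 0 else SlidingWindowAttention(dim)
            for i in range(depth)
        )
    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

x = torch.randn(1, 256, 64)                        # (batch, sequence, hidden)
print(SambaStyleStack(dim=64, depth=4)(x).shape)   # torch.Size([1, 256, 64])
```

The point of the combination is that the SSM carries long-range state at linear cost while attention handles precise local retrieval within the window.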
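Since the hunk header shows `library_name: transformers`, the checkpoint is meant to load through that library. A hedged usage sketch follows: the repo id `pfnet/plamo-2-1b` and the need for `trust_remote_code=True` are assumptions not stated in this diff, and per the NOTE above the model should be prompted for plain completion, not chat.

```python
# Hedged loading sketch for a base (not instruction-tuned) checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "pfnet/plamo-2-1b"  # hypothetical repo id, not given in this diff
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo, trust_remote_code=True)

# Text completion only: the model is NOT instruction-tuned for dialog.
inputs = tokenizer("These are the best places to visit in Tokyo:",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```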