JingweiZuo committed: Update README.md

README.md CHANGED
@@ -30,7 +30,7 @@ Falcon3-Mamba-7B-Base supports a context length up to 32K and was mainly trained
 - state dimension: 16
 - 32k context length
 - 65k vocab size
-- Continue Pretrained from Falcon
+- Continue Pretrained from [Falcon-Mamba-7b](https://arxiv.org/abs/2410.05355), with another 1500 Gigatokens of data consisting of web, code, STEM and high quality data.
 - Postrained on 1.2 million samples of STEM, conversations, code, and safety.
 - Developed by [Technology Innovation Institute](https://www.tii.ae)
 - License: TII Falcon-LLM License 2.0