Update README.md
README.md CHANGED
@@ -31,7 +31,7 @@ library_name: transformers
 
 **ALL YOU NEED IS RWKV**
 
-This is an **early preview** of our 7B parameter
+This is an **early preview** of our 7B-parameter RNN-based model, trained with a 2k context length **(only stage-2 applied, without SFT or DPO)** through 3-stage knowledge distillation from DeepSeek-R1-Distill-Qwen-1.5B. While still a foundational version, it demonstrates:
 
 - ✅ RWKV-7's efficient recurrence mechanism
 - ✅ No self-attention, fully O(n)
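
The updated README highlights RWKV-7's recurrence and its O(n) scaling with no self-attention. As a rough illustration of why a recurrent state update is linear in sequence length, here is a minimal PyTorch sketch of an RWKV-style per-channel-decay recurrence; it is a simplification for exposition, not the exact RWKV-7 update rule, and all tensor names are illustrative.

```python
import torch

def rwkv_style_recurrence(r, w, k, v):
    """Simplified RWKV-style recurrence (NOT the exact RWKV-7 rule).

    One fixed-size (D x D) state per sequence, updated once per token:
    a single pass over T tokens costs O(T), with no T x T attention map.
    """
    B, T, D = k.shape
    state = torch.zeros(B, D, D, dtype=k.dtype, device=k.device)
    outputs = []
    for t in range(T):
        decay = torch.diag_embed(w[:, t])                   # per-channel decay in (0, 1)
        kv = v[:, t].unsqueeze(-1) * k[:, t].unsqueeze(-2)  # outer product v_t k_t^T
        state = state @ decay + kv                          # fixed-size state update
        outputs.append((state @ r[:, t].unsqueeze(-1)).squeeze(-1))
    return torch.stack(outputs, dim=1)  # (B, T, D)

# Illustrative usage: the memory held in `state` is independent of T.
B, T, D = 2, 16, 8
r, k, v = (torch.randn(B, T, D) for _ in range(3))
w = torch.sigmoid(torch.randn(B, T, D))  # keep decay factors in (0, 1)
out = rwkv_style_recurrence(r, w, k, v)  # shape: (2, 16, 8)
```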
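The commit also mentions 3-stage knowledge distillation from DeepSeek-R1-Distill-Qwen-1.5B without detailing the stages. For readers unfamiliar with the general technique, a common logit-level distillation objective is a temperature-scaled KL divergence between teacher and student distributions; the sketch below is that generic formulation with an assumed temperature, not this model's actual training recipe.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, temperature=2.0):
    """Generic logit-level knowledge distillation loss.

    Softens both distributions with a temperature, then minimizes
    KL(teacher || student); the t^2 factor keeps gradient magnitudes
    comparable across temperatures. (Illustrative only; the 3-stage
    recipe used for this model is not described in this commit.)
    """
    t = temperature
    log_p_student = F.log_softmax(student_logits / t, dim=-1)
    log_p_teacher = F.log_softmax(teacher_logits / t, dim=-1)
    return F.kl_div(log_p_student, log_p_teacher,
                    reduction="batchmean", log_target=True) * (t * t)

# Illustrative usage with random logits over a toy vocabulary.
student = torch.randn(4, 32000)  # (batch, vocab) — sizes are made up
teacher = torch.randn(4, 32000)
loss = kd_loss(student, teacher)
```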