Raven (RWKV) as a potential LLM for Pygmalion to use.
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (i.e. training is parallelizable), so it combines the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings. It is also 100% attention-free: you only need the hidden state at position t to compute the state at position t+1, and you can use the "GPT" mode to quickly compute the hidden state for the "RNN" mode (see the sketch below).
It looks promising.
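To make the "two modes, one model" idea concrete, here is a minimal toy sketch of the simplified WKV recurrence at the heart of RWKV. It deliberately omits real RWKV details (per-channel decay vectors, the "bonus" weight for the current token, layer structure, and numerical-stability tricks), and all names are illustrative, but it shows how a sequential RNN-style scan and a parallel GPT-style computation produce identical outputs:

```python
import numpy as np

def wkv_rnn_mode(k, v, w):
    """Sequential "RNN mode": carry a small (num, den) state from t to t+1."""
    num, den = 0.0, 0.0
    out = []
    for t in range(len(k)):
        num = np.exp(-w) * num + np.exp(k[t]) * v[t]
        den = np.exp(-w) * den + np.exp(k[t])
        out.append(num / den)   # weighted average of past values, decayed by w
    return np.array(out), (num, den)

def wkv_gpt_mode(k, v, w):
    """Parallel "GPT mode": every position attends to all earlier ones at once."""
    T = len(k)
    t_idx = np.arange(T)
    # weight of source position i as seen from position t: exp(-w*(t-i) + k_i), i <= t
    logits = -w * (t_idx[:, None] - t_idx[None, :]) + k[None, :]
    weights = np.where(t_idx[:, None] >= t_idx[None, :], np.exp(logits), 0.0)
    return (weights @ v) / weights.sum(axis=1)

rng = np.random.default_rng(0)
k, v, w = rng.normal(size=8) * 0.1, rng.normal(size=8), 0.5
rnn_out, final_state = wkv_rnn_mode(k, v, w)
gpt_out = wkv_gpt_mode(k, v, w)
print(np.allclose(rnn_out, gpt_out))  # True: the two modes agree
```

Because the two modes agree, training can run in the parallel GPT form on GPUs, while inference only carries the small per-step state from token to token; that is where the fast, low-VRAM inference and "infinite" ctx_len claims come from.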
Check it out:
https://github.com/BlinkDL/RWKV-LM
https://huggingface.co./spaces/BlinkDL/Raven-RWKV-7B
https://huggingface.co./spaces/BlinkDL/ChatRWKV-gradio
Discord: https://discord.gg/bDSBUMeFpc
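In practice, state-passing inference looks roughly like the following. This is a hedged sketch based on the rwkv pip package from the ChatRWKV repo: the checkpoint path, token ids, and chunk size are placeholders, and the interface may change, so check the repos above for the current API.

```python
# Sketch of RNN-mode inference with the rwkv pip package (pip install rwkv).
# The model path is a placeholder; download a real checkpoint first.
from rwkv.model import RWKV

model = RWKV(model='/path/to/RWKV-4-Raven-7B.pth', strategy='cpu fp32')

prompt_ids = [510, 3158, 2]   # placeholder token ids from your tokenizer
state = None                  # fixed-size recurrent state, None to start

# Feed the prompt in chunks: only the state is carried between calls,
# so memory use stays constant no matter how long the context grows.
for i in range(0, len(prompt_ids), 2):
    logits, state = model.forward(prompt_ids[i:i + 2], state)

# logits scores the next token; sample from it and feed tokens back
# one at a time, reusing state, to generate text.
```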
Sepp Hochreiter, a deep learning pioneer known for his work on the vanishing gradient problem and LSTM, had this to say about RWKV:
"Github github.com/BlinkDL/RWKV-LM: RNN with transformer-level performance, without using attention. Similar to Apple's Attention Free Transformer. All trained models open-source. Inference is very fast (even on CPUs) and might work on cell phones."
https://twitter.com/hochreitersepp/status/1524270961314484227?s=46&t=KC7cX_tVezEZLb2ntKap9g
User feedback from the RWKV GitHub page:
"I've so far toyed around with the character-based model on our relatively small pre-training dataset (around 10GB of text), and the results are extremely good - similar ppl to models taking much, much longer to train."
"dear god rwkv is fast. i switched to another tab after starting training it from scratch & when i returned it was emitting plausible english & maori words, i left to go microwave some coffee & when i came back it was producing fully grammatically correct sentences."