RWKV-LM
Author: BlinkDL
RWKV (pronounced RwaKuv) is an RNN with strong LLM performance that can also be trained directly like a GPT transformer (parallelizable). The current generation is RWKV-7 "Goose". It combines the best of RNNs and transformers: strong performance, linear time, constant space (no KV cache), fast training, unbounded ctx_len, and free sentence embedding.
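The "linear time, constant space" claim can be illustrated with a toy recurrence: each step updates a fixed-size state from the previous state and the current input, so memory never grows with sequence length the way a transformer's KV cache does. This is a minimal illustrative sketch, not the actual RWKV-7 formula; the function names and the simple exponential-decay rule are assumptions for demonstration only.

```python
def rnn_step(state, x, decay=0.9):
    """One recurrent update. The new state depends only on the previous
    state and the current input, so space is constant and total work is
    linear in sequence length. Toy decayed-average rule, NOT RWKV-7."""
    num, den = state
    num = decay * num + x    # decayed running weighted sum of inputs
    den = decay * den + 1.0  # decayed running normalizer
    return (num, den), num / den  # output: decay-weighted average

def run(sequence):
    """Process a sequence with a single fixed-size state tuple."""
    state = (0.0, 0.0)
    outputs = []
    for x in sequence:
        state, y = rnn_step(state, x)
        outputs.append(y)
    return outputs
```

Because the state is just two scalars here (a vector or matrix in a real model), generating token t+1 costs the same as generating token 2, whereas a vanilla transformer's per-token cost grows with the length of the KV cache.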
Source: BlinkDL/RWKV-LM