The MAMBA Model transformer with a language modeling head on top (a linear layer with weights tied to the input embeddings). As a state space model, its inner states are connected through a recurrence that models the underlying dynamics of the input sequence.
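The state recurrence mentioned above can be illustrated with a toy linear state-space step. This is a minimal, hypothetical sketch of the generic SSM update h_t = A·h_{t-1} + B·x_t, y_t = C·h_t in the scalar case, not Mamba's actual selective-scan implementation (the names `ssm_step` and `run_ssm` are illustrative, and real Mamba parameters are input-dependent and vectorized):

```python
def ssm_step(h, x, A, B, C):
    """One scalar state-space update: advance the hidden state, emit an output."""
    h_next = A * h + B * x   # inner state carries information forward in time
    y = C * h_next           # output is a read-out of the current state
    return h_next, y

def run_ssm(xs, A=0.5, B=1.0, C=2.0):
    """Scan the recurrence over an input sequence, starting from h = 0."""
    h = 0.0
    ys = []
    for x in xs:
        h, y = ssm_step(h, x, A, B, C)
        ys.append(y)
    return ys

# An impulse input decays geometrically through the state (A = 0.5 here),
# showing how earlier inputs influence later outputs via the hidden state.
print(run_ssm([1.0, 0.0, 0.0]))  # → [2.0, 1.0, 0.5]
```

In the real model this recurrence runs per channel over the hidden dimension, and the tied language-modeling head projects the final hidden states back to vocabulary logits.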