---
license: apache-2.0
datasets:
- cerebras/SlimPajama-627B
language:
- en
---
This model accompanies the papers [MoM: Linear Sequence Modeling with Mixture-of-Memories](https://arxiv.org/abs/2502.13685) and [Gated Delta Networks: Improving Mamba2 with Delta Rule](https://arxiv.org/abs/2412.06464).

The model was trained on a 100B-token sample of the SlimPajama dataset.
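
A minimal usage sketch with 🤗 Transformers, assuming the checkpoint loads via `AutoModelForCausalLM` with `trust_remote_code=True` (a common requirement for custom linear-attention architectures); the repository id below is a placeholder, not the actual Hub path:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- replace with the actual repository on the Hub.
repo_id = "org/mom-gated-deltanet"

# trust_remote_code is assumed to be needed for the custom architecture.
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

# Generate a short continuation from a prompt.
inputs = tokenizer("Linear sequence modeling", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```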