flash-linear-attention

Fast linear attention models and layers

Installation

In a virtualenv (create one first if needed):

pip3 install flash-linear-attention
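If you need to create a virtualenv first, a minimal sketch using the standard-library venv module (one common approach; the directory name `.venv` is a convention, not required):

```shell
# Create an isolated environment in ./.venv using the stdlib venv module
python3 -m venv .venv
# Activate it in the current shell session
. .venv/bin/activate
```

With the environment active, run the `pip3 install flash-linear-attention` command above; the package installs into `.venv` rather than the system Python.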


Releases

Version  Released    Bullseye (Python 3.9)  Bookworm (Python 3.11)  Trixie (Python 3.13)  Files
0.4.2    2026-03-12
0.4.1    2025-12-24
0.4.0    2025-10-27
0.3.2    2025-09-10
0.3.1    2025-08-26
0.3.0    2025-07-14
0.2.2    2025-06-05
0.2.1    2025-04-23
0.2.0    2025-04-11
0.1.2    2025-03-31
0.1.1    2025-03-24
0.1      2025-03-20


Page last updated 2026-03-13 13:05:00 UTC