Pull requests: Dao-AILab/flash-attention

Nicer headdim error message
#2227 opened Feb 4, 2026 by drisspg

[WIP] varlen blocksparsity (Draft)
#2224 opened Feb 2, 2026 by reubenconducts

[FA3] Mark current main version as v3.0.0 stable
#2223 opened Feb 2, 2026 by lw

[AI-assisted] CLC work stealing
#2218 opened Jan 31, 2026 by drisspg

[Cute] Bump to CuteDSL (Draft)
#2216 opened Jan 29, 2026 by drisspg

Add loc info & fix API changes for CuTeDSL 4.4
#2204 opened Jan 23, 2026 by keithzzzzz

BWD sm100 2cta
#2202 opened Jan 23, 2026 by tzadouri

[Cute, SM100] Fix comment in tmem_p_offset
#2201 opened Jan 22, 2026 by Edenzzzz

Warn when ninja is missing
#2191 opened Jan 17, 2026 by blueberrycongee

[Cute][Testing] Prototyping a fast test mode for Cute
#2188 opened Jan 16, 2026 by drisspg

[Cute] Add torch.compile support for FA4
#2164 opened Jan 9, 2026 by gilfordting

[Cute] Update deprecated Cute DSL APIs
#2148 opened Jan 7, 2026 by henrylhtsang

[Cute,Fwd,Sm100] fp8 e4m3 and e5m2 support
#2109 opened Dec 29, 2025 by dcw02