fla-org/flash-linear-attention
🚀 Efficient implementations for emerging model architectures
Stars: 4.8k · Forks: 484 · +10 stars/week
Topics: large-language-models, machine-learning-systems, natural-language-processing, sequence-modeling
[Chart: Star & Fork Trend, 18 data points; series: Stars, Forks]
Multi-Source Signals
Growth Velocity
fla-org/flash-linear-attention gained +10 stars this period. 7-day velocity: 0.3%.
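The velocity figure above is presumably stars gained over a trailing window, expressed as a percentage of the current star count. A minimal sketch of that calculation (the function name and the exact window definition are assumptions, not the dashboard's actual code):

```python
def growth_velocity(stars_now: int, stars_window_ago: int) -> float:
    """Stars gained over a trailing window, as a percentage of the
    current star count. Window length (e.g. 7 days) is chosen by the
    caller; this is a hypothetical reconstruction of the metric."""
    gained = stars_now - stars_window_ago
    return 100.0 * gained / stars_now

# Example with round numbers: +10 stars on a base of 4,800
# yields roughly a 0.21% growth rate.
print(round(growth_velocity(4800, 4790), 2))
```

The small gap between the 0.3% "7-day velocity" and the 0.21% "growth rate" shown later suggests the two figures use different windows or baselines.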
| Metric | flash-linear-attention | Yuxi | EasyR1 | prompt-master |
|---|---|---|---|---|
| Stars | 4.8k | 4.8k | 4.8k | 4.8k |
| Forks | 484 | 659 | 367 | 469 |
| Weekly Growth | +10 | +8 | +4 | +33 |
| Language | Python | Python | Python | N/A |
| Sources | 1 | 1 | 1 | 1 |
| License | MIT | MIT | Apache-2.0 | MIT |
[Radar chart: capability comparison — flash-linear-attention vs Yuxi]
- Maintenance Activity (100): Last code push 1 day ago.
- Community Engagement (50): Fork-to-star ratio of 10.0%; an active community is forking and contributing.
- Issue Burden (70): Issue data not yet available.
- Growth Momentum (52): +10 stars this period, a 0.21% growth rate.
- License Clarity (95): Licensed under MIT. Permissive, and safe for commercial use.
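The fork-to-star ratio quoted in the Community Engagement score is simple arithmetic: forks divided by stars. A sketch, assuming that definition (the function name is hypothetical; the dashboard's actual rounding may differ):

```python
def fork_to_star_ratio(forks: int, stars: int) -> float:
    """Forks as a percentage of stars. A high ratio is often read
    as a sign that users are modifying or contributing to the code,
    not just bookmarking it."""
    return 100.0 * forks / stars

# With round numbers close to this repository's (484 forks, ~4.8k stars),
# the ratio lands near 10%.
print(fork_to_star_ratio(480, 4800))
```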
Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.