
flashinfer-ai/flashinfer

FlashInfer: Kernel Library for LLM Serving

Stars: 5.4k · Forks: 877 · Growth: +8/wk (GitHub)
Topics: attention, cuda, distributed-inference, gpu, jit, large-language-models, llm-inference, moe, nvidia, pytorch
Trend rank: 3

[Chart: Star & Fork Trend, 26 data points; series: Stars and Forks]

Multi-Source Signals

Growth Velocity

flashinfer-ai/flashinfer has gained +8 stars this period. 7-day velocity: 0.5%.

Deep analysis is being generated for this repository; signal-backed technical analysis will be available soon.

Metric         flashinfer   deepreasoning   csghub       MemMachine
Stars          5.4k         5.4k            5.4k         5.4k
Forks          877          439             665          165
Weekly Growth  +8           +0              +0           +0
Language       Python       Rust            Vue          Python
Sources        1            1               1            1
License        Apache-2.0   MIT             Apache-2.0   Apache-2.0
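The per-repository metrics in the table map directly onto fields of the GitHub REST API's `GET /repos/{owner}/{repo}` response (`stargazers_count`, `forks_count`, `language`, `license`). A minimal sketch of reducing such a payload to the table's columns, using an illustrative sample payload rather than a live API call:

```python
# Sketch: extract the comparison-table metrics from a GitHub REST API
# repository payload (GET /repos/{owner}/{repo}). The field names are the
# real v3 API; the sample values below are illustrative only.

def summarize_repo(payload: dict) -> dict:
    """Reduce a /repos/{owner}/{repo} JSON payload to the table's columns."""
    lic = payload.get("license") or {}  # "license" may be null for unlicensed repos
    return {
        "name": payload["full_name"],
        "stars": payload["stargazers_count"],
        "forks": payload["forks_count"],
        "language": payload["language"],
        "license": lic.get("spdx_id", "NOASSERTION"),
    }

# Illustrative payload mirroring the flashinfer row of the table
# (star count expanded from the rounded 5.4k for demonstration).
sample = {
    "full_name": "flashinfer-ai/flashinfer",
    "stargazers_count": 5400,
    "forks_count": 877,
    "language": "Python",
    "license": {"spdx_id": "Apache-2.0"},
}

print(summarize_repo(sample))
```

In a live aggregator, the same function would be applied to the JSON of each compared repository (flashinfer, deepreasoning, csghub, MemMachine) to fill the table.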

Capability Radar vs deepreasoning

Maintenance Activity: 100

Last code push was 0 days ago.

Community Engagement: 82

Fork-to-star ratio: 16.4%. Active community forking and contributing.
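The fork-to-star ratio above is plain arithmetic: forks divided by stars, as a percentage. A minimal sketch (the exact star count behind the rounded 5.4k is not shown on the page; 5,354 below is an illustrative value consistent with the reported 16.4%):

```python
# Sketch: the fork-to-star ratio behind the Community Engagement signal.
# forks / stars * 100, rounded to one decimal place.

def fork_star_ratio(forks: int, stars: int) -> float:
    """Percentage of stargazers who also forked the repository."""
    if stars == 0:
        return 0.0  # avoid division by zero for brand-new repos
    return round(forks / stars * 100, 1)

# 5354 is an assumed exact count consistent with the displayed "5.4k".
print(fork_star_ratio(877, 5354))  # → 16.4
```

A ratio in this range is typically read as a sign that a meaningful fraction of the audience is forking to build or contribute, not just bookmarking.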

Issue Burden: 70

Issue data not yet available.

Growth Momentum: 49

+8 stars this period (0.15% growth rate).

License Clarity: 95

Licensed under Apache-2.0, a permissive license that is safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
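The dashboard does not publish its scoring formula, so the following is a hypothetical sketch only: one plausible way to combine the five signal scores above into a composite health score is a weighted average, with the weights below being assumptions rather than the dashboard's actual values.

```python
# Hypothetical sketch: composite repository health score as a weighted
# average of the five signal scores shown on the page. The weights are
# assumptions; the dashboard's real formula is not published.

SIGNALS = {  # per-signal scores from the page, each on a 0-100 scale
    "maintenance_activity": 100,
    "community_engagement": 82,
    "issue_burden": 70,
    "growth_momentum": 49,
    "license_clarity": 95,
}

WEIGHTS = {  # assumed weights, normalized inside composite_score
    "maintenance_activity": 0.30,
    "community_engagement": 0.20,
    "issue_burden": 0.15,
    "growth_momentum": 0.20,
    "license_clarity": 0.15,
}

def composite_score(signals: dict, weights: dict) -> float:
    """Weighted average of per-signal scores (each 0-100)."""
    total = sum(weights.values())
    return round(sum(signals[k] * weights[k] for k in signals) / total, 1)

print(composite_score(SIGNALS, WEIGHTS))
```

Under these assumed weights the composite lands around 81, consistent with the page's framing that higher scores indicate healthier metrics.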