SemiAnalysisAI/InferenceX
Open-source continuous inference benchmarking of Qwen3.5, DeepSeek, and GPT-OSS: GB200 NVL72 vs MI355X vs B200 vs GB300 NVL72 vs H100, and soon™ TPUv6e/v7/Trainium2/3
774 stars · 121 forks · +4 stars/week
Source: GitHub
Topics: ai, amd, benchmark, cuda, gb200, llm, nvidia, pytorch, rocm, sglang, vllm
[Chart: Star & Fork Trend (28 data points), tracking stars and forks]
Multi-Source Signals
Growth Velocity
SemiAnalysisAI/InferenceX gained +4 stars this period. 7-day velocity: 1.3%.
| Metric | InferenceX | QiZhenGPT | vllm-mlx | CodeFuse-muAgent |
|---|---|---|---|---|
| Stars | 774 | 774 | 775 | 775 |
| Forks | 121 | 88 | 177 | 82 |
| Weekly Growth | +4 | +0 | +1 | -1 |
| Language | Python | Python | Python | Python |
| Sources | 1 | 1 | 1 | 1 |
| License | Apache-2.0 | GPL-3.0 | N/A | NOASSERTION |
Capability Radar vs QiZhenGPT
[Radar chart comparing InferenceX and QiZhenGPT across the metrics below]
| Metric | Score | Detail |
|---|---|---|
| Maintenance Activity | 100 | Last code push 0 days ago. |
| Community Engagement | 78 | Fork-to-star ratio: 15.6%; active community forking and contributing. |
| Issue Burden | 70 | Issue data not yet available. |
| Growth Momentum | 71 | +4 stars this period (0.52% growth rate). |
| License Clarity | 95 | Licensed under Apache-2.0; permissive, safe for commercial use. |
Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
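As a rough illustration of how scores like these can be derived from raw repository stats, here is a minimal sketch. The function names and rounding rules are assumptions for illustration, not this dashboard's actual implementation; the inputs are the InferenceX numbers from the comparison table above.

```python
# Hypothetical sketch of two of the radar metrics above, computed from
# raw repository stats. Names and rounding are assumed, not the
# dashboard's real code.

def fork_to_star_ratio(stars: int, forks: int) -> float:
    """Forks as a percentage of stars (community-engagement proxy)."""
    return round(forks / stars * 100, 1)

def weekly_growth_rate(stars_now: int, stars_gained: int) -> float:
    """Stars gained this period, relative to the starting star count."""
    stars_before = stars_now - stars_gained
    return round(stars_gained / stars_before * 100, 2)

# InferenceX figures from the table: 774 stars, 121 forks, +4 this week.
print(fork_to_star_ratio(774, 121))   # 15.6 — matches the 15.6% shown
print(weekly_growth_rate(774, 4))     # 0.52 — matches the 0.52% shown
```

Both values reproduce the figures displayed in the radar section, which suggests the site computes them as simple ratios over the current snapshot rather than from the full 28-point trend history.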