
jundot/omlx

LLM inference server with continuous batching & SSD caching for Apple Silicon — managed from the macOS menu bar

9.1k stars · 775 forks · +157 stars/week (GitHub)

Topics: apple-silicon, inference-server, llm, macos, mlx, openai-api

Trend rank: 3
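Given the openai-api topic tag, omlx presumably exposes an OpenAI-compatible HTTP endpoint. A minimal sketch of a chat-completions request body, assuming a hypothetical local host, port, and model name (none of these are documented here):

```python
import json

# Hypothetical endpoint for a local OpenAI-compatible server;
# the actual host, port, and route used by omlx may differ.
URL = "http://localhost:8080/v1/chat/completions"

# Standard OpenAI chat-completions payload shape; "my-local-model"
# is a placeholder, not a model name documented by omlx.
payload = {
    "model": "my-local-model",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": False,
}

# Serialize the request body as it would be POSTed to URL.
body = json.dumps(payload)
print(body)
```

Any OpenAI-style client pointed at the server's base URL should be able to send this payload unchanged.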

Star & fork trend chart (38 data points; series: Stars, Forks).
Multi-Source Signals

Growth Velocity

jundot/omlx gained +157 stars this period. 7-day velocity: 4.1%.
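The 7-day velocity is presumably percentage star growth over the trailing week; a minimal sketch under that assumption (the week-ago count of 8,742 is hypothetical, chosen only to reproduce the 4.1% figure):

```python
def growth_velocity(stars_now: int, stars_week_ago: int) -> float:
    """Percentage star growth over the trailing 7 days.

    Assumes velocity = (gain / week-ago count) * 100; the dashboard's
    exact formula is not documented.
    """
    return (stars_now - stars_week_ago) / stars_week_ago * 100

# 9,100 stars today vs. a hypothetical 8,742 a week earlier:
print(round(growth_velocity(9_100, 8_742), 1))  # → 4.1
```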

Deep analysis is being generated for this repository.

Metric         omlx        oumi        inference   seatunnel
Stars          9.1k        9.2k        9.2k        9.2k
Forks          775         744         813         2.2k
Weekly Growth  +157        +2          +4          +0
Language       Python      Python      Python      Java
Sources        1           1           1           1
License        Apache-2.0  Apache-2.0  Apache-2.0  Apache-2.0

Capability radar chart: omlx vs oumi.
Maintenance Activity: 100

Last code push: today.

Community Engagement: 42

Fork-to-star ratio: 8.5%. A relatively low fork ratio may indicate passive usage rather than active contribution.
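The fork-to-star figure can be reproduced from the headline counts above; a quick check:

```python
def fork_to_star_ratio(forks: int, stars: int) -> float:
    """Forks as a percentage of stars — a rough proxy for how many
    stargazers also clone/modify the project."""
    return forks / stars * 100

# 775 forks against 9,100 stars:
print(round(fork_to_star_ratio(775, 9_100), 1))  # → 8.5
```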

Issue Burden: 70

Issue data not yet available.

Growth Momentum: 100

+157 stars this period, a 1.72% growth rate.

License Clarity: 95

Licensed under Apache-2.0, a permissive license suitable for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.