
EleutherAI/gpt-neox

An implementation of model parallel autoregressive transformers on GPUs, based on the Megatron and DeepSpeed libraries

Stars: 7.4k · Forks: 1.1k · Weekly growth: +1
Source: GitHub
Topics: deepspeed-library, gpt-3, language-model, transformers

[Chart: Star & Fork Trend (33 data points), plotting Stars and Forks over time]
Multi-Source Signals

Growth Velocity

EleutherAI/gpt-neox has gained +1 star this period. 7-day velocity: 0.0%.


| Metric | gpt-neox | caveman | Prompt_Engineering | kreuzberg |
|---|---|---|---|---|
| Stars | 7.4k | 7.4k | 7.4k | 7.5k |
| Forks | 1.1k | 297 | 952 | 368 |
| Weekly Growth | +1 | +1,488 | +8 | +9 |
| Language | Python | Python | Jupyter Notebook | Rust |
| Sources | 1 | 1 | 1 | 1 |
| License | Apache-2.0 | MIT | NOASSERTION | NOASSERTION |

Capability Radar vs caveman

[Radar chart comparing capability scores for gpt-neox and caveman]
Maintenance Activity: 67

Last code push: 65 days ago.

Community Engagement: 74

Fork-to-star ratio: 14.9%. Active community forking and contributing.
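The 14.9% figure is consistent with dividing the headline fork count by the star count. A minimal sketch, using the rounded values from this page (the dashboard's exact counts may differ slightly):

```python
# Hypothetical recomputation of the fork-to-star ratio shown above,
# using the approximate headline metrics from this page.
stars = 7400   # ~7.4k stars
forks = 1100   # ~1.1k forks

fork_to_star_ratio = forks / stars * 100  # as a percentage
print(f"{fork_to_star_ratio:.1f}%")  # 14.9%
```

A higher ratio is read here as a sign that users are not just starring the repo but forking it to build on or contribute back.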

Issue Burden: 70

Issue data not yet available.

Growth Momentum: 41

+1 star this period, a 0.01% growth rate.
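The 0.01% growth rate matches one new star measured against a base of roughly 7.4k stars. A minimal sketch of that arithmetic, assuming the rounded totals from this page:

```python
# Hypothetical recomputation of the growth rate shown above:
# stars gained this period relative to the current star total.
stars_gained = 1
total_stars = 7400  # ~7.4k stars

growth_rate = stars_gained / total_stars * 100  # as a percentage
print(f"{growth_rate:.2f}%")  # 0.01%
```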

License Clarity: 95

Licensed under Apache-2.0. Permissive; safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.