
microsoft/LoRA

Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"

Stars: 13.4k · Forks: 895 · +3 stars/week
Source: GitHub
Topics: adaptation, deberta, deep-learning, gpt-2, gpt-3, language-model, lora, low-rank, pytorch, roberta
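To make the repository's subject concrete: LoRA freezes a pretrained weight matrix W and learns a low-rank update B·A, scaled by alpha / r, so the effective weight becomes W + (alpha / r)·B·A. The sketch below illustrates that idea in plain NumPy; it is not loralib's actual API (which wraps PyTorch layers), and all dimension names and values are hypothetical.

```python
import numpy as np

# Illustrative sketch of the low-rank adaptation idea behind LoRA.
# Not loralib's real API; dimensions and scaling are hypothetical examples.
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 8, 2, 16   # output dim, input dim, rank, scaling

W = rng.standard_normal((d_out, d_in))     # frozen pretrained weight
A = rng.standard_normal((r, d_in)) * 0.01  # trainable, small random init
B = np.zeros((d_out, r))                   # trainable, zero init

def forward(x):
    # Base path plus scaled low-rank path: x @ (W + (alpha/r) * B @ A).T
    return x @ W.T + (alpha / r) * (x @ A.T @ B.T)

x = rng.standard_normal((1, d_in))
# With B initialized to zero, the adapted layer matches the base layer exactly,
# so adaptation starts from the pretrained model's behavior.
assert np.allclose(forward(x), x @ W.T)
```

Because B starts at zero, training begins from the unmodified pretrained model, and only the small A and B matrices (2·r·d parameters instead of d²) need gradients.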

[Chart: Star & Fork Trend, 25 data points; series: Stars, Forks]

Multi-Source Signals

Growth Velocity

microsoft/LoRA has gained +3 stars this period. 7-day velocity: 0.1%.


Metric          LoRA     DeepTutor    ydata-profiling   litgpt
Stars           13.4k    13.5k        13.5k             13.3k
Forks           895      1.8k         1.8k              1.4k
Weekly Growth   +3       +1,257       +2                +0
Language        Python   Python       Python            Python
Sources         1        1            1                 1
License         MIT      Apache-2.0   MIT               Apache-2.0

Capability Radar vs DeepTutor

[Radar chart comparing LoRA and DeepTutor across the capability dimensions below.]
Maintenance Activity: 0

Last code push was 478 days ago.

Community Engagement: 33

Fork-to-star ratio: 6.7%. A lower fork ratio may indicate passive usage.

Issue Burden: 70

Issue data not yet available.

Growth Momentum: 41

+3 stars this period (0.02% growth rate).

License Clarity: 95

Licensed under MIT. Permissive; safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.