
ChenRocks/Distill-BERT-Textgen

Research code for the ACL 2020 paper "Distilling Knowledge Learned in BERT for Text Generation".

129 stars · 20 forks · +0/wk
Source: GitHub
Topics: bert-model, knowledge-distillation, machine-translation, natural-language-processing
Trend: 0

[Chart: Star & Fork Trend (18 data points), plotting stars and forks over time]

Multi-Source Signals

Growth Velocity

ChenRocks/Distill-BERT-Textgen gained +0 stars this period. Velocity data will be available after more historical data is collected.
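Velocity here is just the star delta between successive snapshots, normalized to a per-week rate. A minimal sketch of that calculation, assuming (timestamp, star count) snapshots like the 18 data points charted above; the weekly_velocity helper and the sample values are illustrative, not this dashboard's actual code:

```python
from datetime import datetime

# Illustrative snapshots: (timestamp, star_count) pairs, oldest first.
snapshots = [
    (datetime(2025, 1, 1), 129),
    (datetime(2025, 1, 8), 129),
]

def weekly_velocity(points):
    """Stars gained per week between the first and last snapshot."""
    (t0, s0), (t1, s1) = points[0], points[-1]
    weeks = (t1 - t0).total_seconds() / (7 * 24 * 3600)
    return (s1 - s0) / weeks if weeks else 0.0

print(f"{weekly_velocity(snapshots):+.1f} stars/wk")  # +0.0 stars/wk
```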

Deep analysis is being generated for this repository; signal-backed technical analysis will be available soon.

Metric          Distill-BERT-Textgen   my-cheat-sheets   reComputer-Jetson-for-Beginners   faster-chat
Stars           129                    129               128                               130
Forks           20                     42                24                                12
Weekly Growth   +0                     +0                +0                                +0
Language        Python                 Go                Python                            JavaScript
Sources         1                      1                 1                                 1
License         MIT                    MIT               MIT                               MIT
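The raw fields in this table map directly onto the GitHub REST API's repository object. A minimal sketch of fetching them, assuming the public unauthenticated endpoint and the requests package; the repo_metrics helper is hypothetical, though the endpoint and field names are the API's real ones:

```python
import requests

def repo_metrics(full_name: str) -> dict:
    """Fetch the comparison-table fields for one repository."""
    r = requests.get(f"https://api.github.com/repos/{full_name}", timeout=10)
    r.raise_for_status()
    d = r.json()
    return {
        "stars": d["stargazers_count"],
        "forks": d["forks_count"],
        "language": d["language"],
        "license": d["license"]["spdx_id"] if d["license"] else None,
        "last_push": d["pushed_at"],
    }

print(repo_metrics("ChenRocks/Distill-BERT-Textgen"))
```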

[Radar chart: Capability Radar, Distill-BERT-Textgen vs my-cheat-sheets]
Maintenance Activity 0

Last code push was 1743 days ago.

Community Engagement 78

Fork-to-star ratio: 15.5% (20 forks / 129 stars). An active community is forking and contributing; a sketch of this calculation follows the score list.

Issue Burden 70

Issue data not yet available.

Growth Momentum 30

No measurable growth in the current period (first-day cold start expected).

License Clarity 95

Licensed under MIT. Permissive — safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
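The per-dimension scores above can be reproduced from the same public fields. A minimal sketch, assuming the inputs shown on this page (129 stars, 20 forks, a last push 1743 days ago); the fork_to_star_ratio and maintenance_score helpers and their thresholds are illustrative guesses, not the dashboard's actual formula:

```python
def fork_to_star_ratio(stars: int, forks: int) -> float:
    """Percentage of stargazers who also forked: 20 / 129 ~ 15.5%."""
    return 100.0 * forks / stars if stars else 0.0

def maintenance_score(days_since_push: int) -> int:
    """Map push recency onto a 0-100 score (illustrative thresholds:
    full credit within 30 days, zero at one year or older)."""
    if days_since_push <= 30:
        return 100
    if days_since_push >= 365:
        return 0
    return round(100 * (365 - days_since_push) / (365 - 30))

print(f"{fork_to_star_ratio(129, 20):.1f}%")  # 15.5%
print(maintenance_score(1743))                # 0, matching Maintenance Activity
```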