
ymcui/Chinese-BERT-wwm

Pre-Training with Whole Word Masking for Chinese BERT (Chinese BERT-wwm model series)

Stars: 10.2k · Forks: 1.4k · Growth: +2/wk
Source: GitHub
Topics: bert, bert-wwm, bert-wwm-ext, chinese-bert, nlp, pytorch, rbt, roberta, roberta-wwm, tensorflow

[Chart: Star & Fork Trend, 31 data points, series: Stars and Forks]

Multi-Source Signals

Growth Velocity

ymcui/Chinese-BERT-wwm has gained +2 stars this period. 7-day velocity: 0.0%.
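The velocity figure can be reproduced with simple arithmetic. A minimal sketch, assuming the dashboard compares the current star count against the count seven days earlier (the function name and exact counts are illustrative):

```python
def velocity_7d(stars_now: int, stars_week_ago: int) -> float:
    """Week-over-week star growth as a percentage (illustrative formula)."""
    if stars_week_ago == 0:
        return 0.0
    return (stars_now - stars_week_ago) / stars_week_ago * 100

# +2 stars on a base of roughly 10.2k rounds to 0.0%
print(round(velocity_7d(10200, 10198), 1))
```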


| Metric | Chinese-BERT-wwm | LLMsPracticalGuide | llm-engineer-toolkit | CoreNLP |
|---|---|---|---|---|
| Stars | 10.2k | 10.2k | 10.1k | 10.1k |
| Forks | 1.4k | 787 | 1.6k | 2.7k |
| Weekly Growth | +2 | -2 | +4 | +0 |
| Language | Python | N/A | N/A | Java |
| Sources | 1 | 1 | 1 | 1 |
| License | Apache-2.0 | N/A | Apache-2.0 | GPL-3.0 |

Capability Radar vs LLMsPracticalGuide

Maintenance Activity: 0

Last code push was 268 days ago.
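The staleness figure is the elapsed time since the repository's last push. A minimal sketch, assuming the timestamp comes from the `pushed_at` field of GitHub's REST API repository response (the dates here are illustrative, chosen to reproduce the 268-day gap):

```python
from datetime import datetime, timezone

def days_since_push(pushed_at_iso: str, now: datetime) -> int:
    """Whole days elapsed since the repo's last push.

    pushed_at_iso: ISO-8601 timestamp such as GitHub's `pushed_at` field.
    """
    pushed = datetime.fromisoformat(pushed_at_iso.replace("Z", "+00:00"))
    return (now - pushed).days

# Illustrative: a push 268 days before a fixed "now"
now = datetime(2025, 6, 1, tzinfo=timezone.utc)
print(days_since_push("2024-09-06T00:00:00Z", now))
```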

Community Engagement: 68

Fork-to-star ratio: 13.6%, indicating an active community that forks and contributes back.
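The ratio is plain division of forks by stars. A quick sketch using the rounded headline counts; the dashboard's 13.6% presumably comes from the exact, unrounded counts:

```python
def fork_to_star_ratio(forks: int, stars: int) -> float:
    """Forks as a percentage of stars."""
    return forks / stars * 100

# With the rounded counts of 1.4k forks and 10.2k stars
# this lands near 13.7%, close to the dashboard's 13.6%.
print(round(fork_to_star_ratio(1400, 10200), 1))
```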

Issue Burden: 70

Issue data not yet available.

Growth Momentum: 41

+2 stars this period, a 0.02% growth rate.
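The growth rate is the period's new stars as a share of the total. A minimal sketch with the rounded counts from this card:

```python
def growth_rate(new_stars: int, total_stars: int) -> float:
    """New stars this period as a percentage of the total star count."""
    return new_stars / total_stars * 100

# 2 new stars on ~10.2k total
print(round(growth_rate(2, 10200), 2))
```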

License Clarity: 95

Licensed under Apache-2.0, a permissive license that is safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.