
jxiw/BiGS

Official repository of Pretraining Without Attention (BiGS). BiGS is the first model to achieve BERT-level transfer learning on the GLUE benchmark with subquadratic complexity in sequence length, i.e., without attention.
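As rough intuition for the subquadratic claim, here is a back-of-envelope sketch (not code from the repository): self-attention scores every token pair, costing O(n^2) in sequence length n, while the FFT-based long convolutions used by state-space models such as BiGS cost O(n log n). The counts below are asymptotic proxies with constants and hidden dimensions dropped, not measured FLOPs.

    import math

    # Rough per-layer operation counts as a function of sequence length n.
    # Asymptotic proxies only: constants and hidden dimensions are dropped.
    def attention_ops(n: int) -> int:
        return n * n  # self-attention scores every token pair: O(n^2)

    def ssm_conv_ops(n: int) -> int:
        return int(n * math.log2(n))  # FFT-based long convolution: O(n log n)

    for n in (512, 4096, 32768):
        ratio = attention_ops(n) / ssm_conv_ops(n)
        print(f"n={n:>6}: attention ~{attention_ops(n):,} ops, "
              f"SSM ~{ssm_conv_ops(n):,} ops, ~{ratio:.0f}x gap")

The gap widens with sequence length, which is why attention-free architectures become attractive for long inputs.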

GitHub · 118 stars · 8 forks · +0/wk
Topics: deep-learning, natural-language-processing

[Chart: Star & Fork Trend (116 data points), plotting stars and forks over time]

Multi-Source Signals

Growth Velocity

jxiw/BiGS has +0 stars this period. Velocity data will be available after more historical data has been collected.


Metric         BiGS        nagato-ai  STAMP   dashscope-sdk
Stars          118         118        118     118
Forks          8           10         52      13
Weekly Growth  +0          +0         +0      +0
Language       Python      Python     Python  C#
Sources        1           1          1       1
License        Apache-2.0  MIT        MIT     MIT

Capability Radar vs nagato-ai

[Radar chart comparing BiGS and nagato-ai across the five dimensions scored below]
Maintenance Activity: 0

Last code push was 766 days ago.

Community Engagement: 34

Fork-to-star ratio: 6.8% (8 forks / 118 stars; see the sketch below the score notes). A lower fork ratio may indicate passive usage.

Issue Burden: 70

Issue data is not yet available.

Growth Momentum: 30

No measurable growth in the current period (a first-day cold start is expected).

License Clarity: 95

Licensed under Apache-2.0, a permissive license that is safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
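For illustration, here is a minimal sketch of how such signals could be derived from the raw numbers on this page (118 stars, 8 forks, last push 766 days ago). The helper names and the 365-day staleness threshold are assumptions for this sketch, not the dashboard's actual scoring model.

    from dataclasses import dataclass

    @dataclass
    class RepoStats:
        stars: int
        forks: int
        days_since_push: int

    def fork_to_star_ratio(s: RepoStats) -> float:
        # Community-engagement signal: forks as a share of stars.
        return s.forks / s.stars if s.stars else 0.0

    def maintenance_score(s: RepoStats, stale_after_days: int = 365) -> int:
        # Hypothetical 0-100 score: 100 for a push today, decaying
        # linearly to 0 once the repo is `stale_after_days` old.
        fresh = max(0.0, 1.0 - s.days_since_push / stale_after_days)
        return round(100 * fresh)

    bigs = RepoStats(stars=118, forks=8, days_since_push=766)
    print(f"fork-to-star ratio: {fork_to_star_ratio(bigs):.1%}")  # 6.8%
    print(f"maintenance score: {maintenance_score(bigs)}")        # 0

Under this assumed linear decay, a 766-day-old push bottoms out at 0, which matches the Maintenance Activity score shown above.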
