
PhoebusSi/Alpaca-CoT

We unified the interfaces of instruction-tuning data (e.g., CoT data), multiple LLMs, and parameter-efficient methods (e.g., LoRA, P-Tuning) for easy use. We welcome open-source enthusiasts to initiate any meaningful PR on this repo and integrate as many LLM-related technologies as possible. We built a fine-tuning platform that makes it easy for researchers to get started with and use large language models; we welcome open-source enthusiasts to submit any meaningful PR!

Stars: 2.8k · Forks: 251 · Weekly growth: +0
alpaca chatglm chatgpt cot instruction-tuning llama llm lora moss p-tuning parameter-efficient pytorch

[Chart: Star & Fork Trend (35 data points), with Stars and Forks series]

Multi-Source Signals

Growth Velocity

PhoebusSi/Alpaca-CoT gained +0 stars this period. 7-day velocity: 0.0%.
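The velocity figure can be read as the period's star gain expressed as a percentage of the total star count. A minimal sketch of that arithmetic; the dashboard's exact formula is an assumption, and `star_velocity` is a hypothetical helper:

```python
def star_velocity(new_stars: int, total_stars: int) -> float:
    """Hypothetical velocity metric: period star gain as a
    percentage of the repository's total star count."""
    if total_stars == 0:
        return 0.0
    return round(100 * new_stars / total_stars, 1)

# Alpaca-CoT gained +0 stars this period out of ~2,800 total:
print(star_velocity(0, 2800))  # 0.0
```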


Metric         Alpaca-CoT        LandPPT      llm-guard  basic-memory
Stars          2.8k              2.8k         2.8k       2.8k
Forks          251               389          368        180
Weekly Growth  +0                +34          +4         +8
Language       Jupyter Notebook  JavaScript   Python     Python
Sources        1                 1            1          1
License        Apache-2.0        NOASSERTION  MIT        AGPL-3.0

[Radar chart: Capability Radar vs LandPPT, comparing Alpaca-CoT and LandPPT]
Maintenance Activity: 0

Last code push: 849 days ago.

Community Engagement: 45

Fork-to-star ratio: 9.0%. A lower fork ratio may indicate passive usage.
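The 9.0% figure follows directly from the repository's counts above (251 forks against ~2,800 stars). A quick sketch of that arithmetic; `fork_to_star_ratio` is an illustrative helper, not part of any API:

```python
def fork_to_star_ratio(forks: int, stars: int) -> float:
    """Fork count expressed as a percentage of star count."""
    if stars == 0:
        return 0.0
    return round(100 * forks / stars, 1)

print(fork_to_star_ratio(251, 2800))  # 9.0
```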

Issue Burden: 70

Issue data not yet available.

Growth Momentum: 30

No measurable growth in the current period (first-day cold start expected).

License Clarity: 95

Licensed under Apache-2.0. Permissive — safe for commercial use.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.
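How the five card scores above might roll up into a single health figure is not disclosed; the equal-weight average below is purely an illustrative assumption, using the scores shown on this page:

```python
# Card scores from this page (Maintenance, Community, Issue Burden,
# Growth, License). Equal weighting is an assumption, not the
# dashboard's actual method.
SCORES = {
    "maintenance_activity": 0,
    "community_engagement": 45,
    "issue_burden": 70,
    "growth_momentum": 30,
    "license_clarity": 95,
}

def overall_health(scores: dict[str, int]) -> float:
    """Unweighted mean of the individual card scores."""
    return sum(scores.values()) / len(scores)

print(overall_health(SCORES))  # 48.0
```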