Maximilian-Winter/llama-cpp-agent
The llama-cpp-agent framework is a tool designed for easy interaction with Large Language Models (LLMs). It lets users chat with LLMs, execute structured function calls, and get structured output. It also works with models that are not fine-tuned for JSON output or function calling.
Star & Fork Trend (35 data points)
Multi-Source Signals
Growth Velocity
Maximilian-Winter/llama-cpp-agent lost 3 stars this period. 7-day velocity: -0.5%.
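The -0.5% figure can be reproduced from the table values; a minimal sketch, assuming the 7-day velocity is the weekly star delta expressed as a percentage of the current star count (the exact formula the dashboard uses is not stated):

```python
def weekly_velocity(star_delta: int, current_stars: int) -> float:
    """Weekly star delta as a percentage of the current star count,
    rounded to one decimal place."""
    return round(100 * star_delta / current_stars, 1)

# -3 stars on a base of 623 stars (values from the metrics table below)
print(weekly_velocity(-3, 623))  # → -0.5
```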
| Metric | llama-cpp-agent | TrustLLM | Awesome-Long-Chain-of-Thought-Reasoning | lix |
|---|---|---|---|---|
| Stars | 623 | 623 | 623 | 623 |
| Forks | 69 | 67 | 27 | 16 |
| Weekly Growth | -3 | +0 | +0 | +1 |
| Language | Python | Python | N/A | TypeScript |
| Sources | 1 | 1 | 1 | 1 |
| License | NOASSERTION | MIT | N/A | N/A |
Capability Radar vs TrustLLM
Last code push 31 days ago.
Fork-to-star ratio: 11.1%. Active community forking and contributing.
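The 11.1% ratio follows from the table values; a minimal sketch, assuming the ratio is simply forks divided by stars as a percentage:

```python
def fork_to_star_ratio(forks: int, stars: int) -> float:
    """Forks as a percentage of stars, rounded to one decimal place."""
    return round(100 * forks / stars, 1)

# 69 forks and 623 stars (values from the metrics table)
print(fork_to_star_ratio(69, 623))  # → 11.1
```

A higher ratio is commonly read as a sign that users are cloning the project to modify or contribute to it, which is the interpretation the dashboard gives here.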
Issue data not yet available.
No net growth in the current period (a first-day cold start is expected).
No clear license detected; proceed with caution.
Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.