
Maximilian-Winter/llama-cpp-agent

The llama-cpp-agent framework is a tool for easy interaction with Large Language Models (LLMs). It lets users chat with LLMs, execute structured function calls, and get structured output, and it also works with models that are not fine-tuned for JSON output or function calling.
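To make "structured output" concrete, here is a minimal, library-agnostic sketch of the idea: the model's reply is validated against an expected schema instead of being treated as free text. The schema, field names, and hard-coded model reply below are all hypothetical illustrations, not llama-cpp-agent's actual API; in practice the framework drives a llama.cpp model to produce the JSON.

```python
# Library-agnostic sketch of "structured output": validate an LLM reply
# against a schema. The model reply is hard-coded here for illustration.
import json

# Hypothetical schema: the fields we expect the model to fill in.
SCHEMA = {"city": str, "temperature_c": float}

def parse_structured(reply: str) -> dict:
    """Parse a model reply and check it matches the expected schema."""
    data = json.loads(reply)
    for field, ftype in SCHEMA.items():
        if not isinstance(data.get(field), ftype):
            raise ValueError(f"field {field!r} is not a {ftype.__name__}")
    return data

model_reply = '{"city": "Berlin", "temperature_c": 18.5}'
print(parse_structured(model_reply))
```

A failed validation raises instead of silently passing malformed output downstream, which is the practical benefit of constrained/structured generation.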

623 stars · 69 forks · -3/wk
GitHub
agents function-calling llamacpp llm llm-agent llm-framework llms parallel-function-call

[Chart: Star & Fork Trend (35 data points) — series: Stars, Forks]

Multi-Source Signals

Growth Velocity

Maximilian-Winter/llama-cpp-agent lost 3 stars this period. 7-day velocity: -0.5%.
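The velocity figure can be reproduced from the raw numbers shown on this page. This assumes velocity is simply the weekly star delta as a percentage of the current star count; the page does not state its exact formula.

```python
# Reproduce the 7-day velocity from the page's raw numbers.
stars_now = 623    # current star count
delta_week = -3    # stars gained/lost this period
velocity = delta_week / stars_now * 100  # assumed formula
print(f"{velocity:.1f}%")  # -0.5%
```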


| Metric        | llama-cpp-agent | TrustLLM | Awesome-Long-Chain-of-Thought-Reasoning | lix        |
|---------------|-----------------|----------|-----------------------------------------|------------|
| Stars         | 623             | 623      | 623                                     | 623        |
| Forks         | 69              | 67       | 27                                      | 16         |
| Weekly Growth | -3              | +0       | +0                                      | +1         |
| Language      | Python          | Python   | N/A                                     | TypeScript |
| Sources       | 1               | 1        | 1                                       | 1          |
| License       | NOASSERTION     | MIT      | N/A                                     | N/A        |

Capability Radar vs TrustLLM

[Radar chart comparing llama-cpp-agent and TrustLLM]
Maintenance Activity 86

Last code push 31 days ago.

Community Engagement 55

Fork-to-star ratio: 11.1%. Active community forking and contributing.
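The fork-to-star ratio is a simple quotient of the two counts shown at the top of the page; the sketch below assumes that straightforward definition.

```python
# Fork-to-star ratio from the page's headline counts.
stars, forks = 623, 69
ratio = forks / stars * 100
print(f"{ratio:.1f}%")  # 11.1%
```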

Issue Burden 70

Issue data not yet available.

Growth Momentum 30

No measurable growth in the current period (first-day cold start expected).

License Clarity 30

No clear license detected — proceed with caution.

Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.