microsoft/LLMLingua
[EMNLP'23, ACL'24] To speed up LLM inference and enhance LLMs' perception of key information, LLMLingua compresses the prompt and KV-cache, achieving up to 20x compression with minimal performance loss.
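LLMLingua's actual method ranks tokens by a small language model's perplexity and drops the least informative ones. As a rough, self-contained sketch of that idea only (frequency-based surprisal stands in for model perplexity; the function name and scoring are illustrative, not the library's API):

```python
import math
from collections import Counter

def compress_prompt(prompt: str, keep_ratio: float = 0.5) -> str:
    """Toy prompt compression: keep the highest-surprisal words.

    Stand-in for LLMLingua's perplexity-based token pruning --
    surprisal here is estimated from word frequency within the
    prompt itself, NOT from a language model.
    """
    words = prompt.split()
    counts = Counter(words)
    total = len(words)
    # Surprisal -log p(word): rarer words carry more information.
    scored = [(-math.log(counts[w] / total), i, w) for i, w in enumerate(words)]
    k = max(1, int(total * keep_ratio))
    # Keep the k most surprising words, then restore original order.
    kept = sorted(sorted(scored, reverse=True)[:k], key=lambda t: t[1])
    return " ".join(w for _, _, w in kept)

prompt = "the cat sat on the mat and the cat slept on the mat"
print(compress_prompt(prompt, keep_ratio=0.4))
```

The real library exposes far richer controls (target token budgets, instruction/question-aware compression); this sketch only illustrates the keep-the-informative-tokens principle.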
Star & Fork Trend (19 data points)
Multi-Source Signals
Growth Velocity
microsoft/LLMLingua gained +3 stars this period. 7-day velocity: 0.2%.
No comparable projects found in the same topic categories.
Last code push 2 days ago.
Fork-to-star ratio: 6.0%. A lower fork ratio may indicate passive, read-only usage rather than active modification.
Issue data not yet available.
+3 stars this period (0.05% growth rate).
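The displayed percentages follow directly from raw repository counts. A minimal sketch, where the star and fork counts are illustrative assumptions chosen to reproduce the 6.0% and 0.05% figures above:

```python
def fork_to_star_ratio(forks: int, stars: int) -> float:
    """Forks as a percentage of total stars."""
    return 100.0 * forks / stars

def growth_rate(new_stars: int, stars: int) -> float:
    """Stars gained this period as a percentage of the current total."""
    return 100.0 * new_stars / stars

# Hypothetical counts consistent with the dashboard figures.
stars, forks, new_stars = 6000, 360, 3
print(f"fork-to-star ratio: {fork_to_star_ratio(forks, stars):.1f}%")
print(f"growth rate: {growth_rate(new_stars, stars):.2f}%")
```

In practice these counts would come from the GitHub API's `stargazers_count` and `forks_count` fields rather than hard-coded values.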
Licensed under MIT, a permissive license suitable for commercial use.
Risk scores are computed from real-time repository data. Higher scores indicate healthier metrics.