sqz - context compressor for LLM CLIs
Rust CLI that compresses prompt and context payloads before they hit the model, trading a small accuracy delta for measurable token-cost savings.
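The core trade sqz makes — a small accuracy delta for fewer tokens — can be illustrated with a minimal sketch. This is a hypothetical example, not sqz's actual algorithm: it collapses trailing whitespace and blank lines, then estimates savings with a rough 4-characters-per-token heuristic.

```rust
/// Drop trailing whitespace and blank lines from a context payload.
/// Lossy in the sense that exact formatting is gone, but the semantic
/// content the model needs is preserved.
fn compress(input: &str) -> String {
    input
        .lines()
        .map(str::trim_end)
        .filter(|line| !line.is_empty())
        .collect::<Vec<_>>()
        .join("\n")
}

/// Rough token estimate (~4 characters per token for English text).
fn approx_tokens(s: &str) -> usize {
    s.chars().count() / 4
}

fn main() {
    let raw = "fn main() {   \n\n    println!(\"hi\");   \n\n}\n";
    let small = compress(raw);
    println!("{} tokens -> {} tokens", approx_tokens(raw), approx_tokens(&small));
}
```

Real compressors apply much smarter transforms, but the shape is the same: a pure `&str -> String` pass that runs before the payload hits the model.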
Related entries
tokf - compress command output before it hits the LLM
Config-driven Rust CLI that compresses verbose command output (logs, test runs, build output) before it reaches an agent's context window. Pluggable rules for stripping noise without losing signal.
opensrc - fetch npm package source for AI agents
Vercel Labs CLI in Rust that fetches the actual source for npm packages so AI coding agents can read library internals instead of guessing from type stubs.
claude-code-rust - native Rust port of Claude Code
Rust reimplementation of Anthropic's Claude Code CLI that avoids the upstream V8 heap OOM in long sessions. Daily-driver replacement that keeps the same UX without Node.
lean-ctx - MCP context optimizer for AI coding agents
Rust MCP server plus shell hook that compresses context for Cursor, Claude Code, Copilot, Windsurf, Gemini CLI and 24 other tools. Single binary, zero telemetry.
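The "pluggable rules for stripping noise" idea that tokf describes could be sketched as follows. This is an illustrative design under assumed names (`Rule`, `filter_output`), not any of these tools' real APIs: each rule is a named predicate over output lines, and a line survives only if no rule flags it as noise.

```rust
/// A named predicate that decides whether an output line is noise.
/// (Hypothetical type for illustration; not tokf's actual config model.)
struct Rule {
    #[allow(dead_code)]
    name: &'static str,
    is_noise: fn(&str) -> bool,
}

/// Keep only the lines that no rule classifies as noise.
fn filter_output(output: &str, rules: &[Rule]) -> String {
    output
        .lines()
        .filter(|line| !rules.iter().any(|rule| (rule.is_noise)(line)))
        .collect::<Vec<_>>()
        .join("\n")
}

fn main() {
    let rules = [
        Rule { name: "blank", is_noise: |l: &str| l.trim().is_empty() },
        Rule { name: "progress", is_noise: |l: &str| l.starts_with("Compiling") },
    ];
    let log = "Compiling foo v0.1.0\n\nerror[E0308]: mismatched types";
    println!("{}", filter_output(log, &rules));
}
```

Because rules are plain data, a config-driven tool can load them from a file and compose them per command, stripping build chatter while keeping the error lines an agent actually needs.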