llmgateway - unified LLM provider gateway
Self-hostable gateway that routes requests across Anthropic, OpenAI, and other LLM providers with API-key management, analytics, and per-team policies. Designed for multi-provider agent deployments.
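Gateways like this usually work by exposing a single OpenAI-compatible HTTP endpoint and dispatching to the upstream provider based on the model name, so client code only changes its base URL and key. A minimal client-side sketch of that pattern, assuming a gateway at localhost:4001 and provider-prefixed model IDs (the address, path, and model names here are illustrative, not llmgateway's documented API):

```python
import json
import urllib.request

# Hypothetical self-hosted gateway endpoint (OpenAI-style chat completions path).
GATEWAY_URL = "http://localhost:4001/v1/chat/completions"

def build_request(model: str, prompt: str, api_key: str) -> urllib.request.Request:
    """Build a chat-completion request addressed to the gateway.

    A provider prefix in `model` (e.g. "openai/..." or "anthropic/...")
    is one common convention a gateway can use to pick the upstream
    provider; the gateway holds the real provider credentials.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        GATEWAY_URL,
        data=body,
        headers={
            "Content-Type": "application/json",
            # A gateway-issued key, not a raw provider key.
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# The same client code reaches different providers by changing only the model id:
req = build_request("anthropic/claude-sonnet", "Summarize this log file.", "gw-key-123")
```

The point of the pattern is that per-team policies, usage analytics, and key rotation all live in the gateway, while clients stay provider-agnostic.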
Related entries
compose-for-agents - Docker Compose recipes for agents
Docker's collection of ready-to-use Compose stacks for orchestrating open-source LLMs, tools, and agent runtimes. Useful starting points for self-hosted setups.
OpenSail - open AI workspace platform
Self-hostable platform for building, running, and sharing AI workspace agents and apps with any model. No vendor lock-in - bring your own LLM provider or run models locally.
dcm - DockerComposeMaker for self-hosters
Self-hostable web UI for picking and assembling docker-compose.yml files for home servers. Shareable configs and a discovery feed for new containers.
clsh - browser terminal for agent CLIs
PWA that streams a node-pty terminal to any device, letting you reach Claude Code or other agent sessions from a phone or tablet. Self-hosted with no cloud relay.