tldr MCP Gateway

Shrink 41+ MCP tool schemas by 86% before they hit your model

tldr is a local proxy that sits between your AI coding harness and upstream MCP servers, solving one of the most underappreciated problems in agentic workflows: context bloat from tool schema proliferation. When you connect GitHub MCP, filesystem MCP, and a few others, you can easily be sending 24,000+ tokens of tool schemas to the model before any work begins. Instead of passing all those schemas directly, tldr exposes exactly five wrapper tools to the model: search_tools, execute_plan, call_raw, inspect_tool, and get_result. The model learns which underlying tools exist on-demand through search_tools, then calls them through the proxy. GitHub MCP's 24,473-token schema surface compresses to 3,482 tokens — an 86% reduction. Output responses are further compressed through field stripping, a 4,096-token cap, and a 64KB byte limit. This is a genuinely practical solution for power users running multi-MCP setups who've noticed degraded performance as their tool count grows. The tradeoff is one extra hop of indirection, but the token savings pay for themselves in improved model attention and lower API costs.
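The output-compression side of this pipeline can be sketched in a few lines. The following is a minimal illustration, not tldr's actual implementation: the field names in `STRIP_FIELDS` and the 4-bytes-per-token heuristic are assumptions chosen for the example, while the 4,096-token cap and 64KB byte limit come from the description above.

```python
import json

# Hypothetical sketch of tldr-style output compression: strip noisy fields,
# then enforce an approximate token cap and a hard byte limit.
MAX_TOKENS = 4096          # token cap on tool output (from the review)
MAX_BYTES = 64 * 1024      # 64KB byte limit (from the review)
STRIP_FIELDS = {"node_id", "url", "avatar_url", "etag"}  # assumed example fields

def strip_fields(value):
    """Recursively drop fields that rarely help the model."""
    if isinstance(value, dict):
        return {k: strip_fields(v) for k, v in value.items()
                if k not in STRIP_FIELDS}
    if isinstance(value, list):
        return [strip_fields(v) for v in value]
    return value

def compress_output(raw: dict) -> str:
    # Compact JSON encoding, then two-stage truncation.
    text = json.dumps(strip_fields(raw), separators=(",", ":"))
    # Rough heuristic: ~4 bytes of JSON per token (an assumption, not tldr's).
    if len(text) > MAX_TOKENS * 4:
        text = text[: MAX_TOKENS * 4]
    return text.encode()[:MAX_BYTES].decode(errors="ignore")
```

In a real proxy the token cap would use the model's actual tokenizer rather than a byte heuristic, and truncation would ideally preserve JSON validity; this sketch only shows where each limit applies in the pipeline.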

Panel Reviews

The Builder

Developer Perspective

Ship

This solves a real problem I've hit personally — when you connect enough MCP servers, you're wasting a quarter of your context window on tool definitions before a single line of code is written. The five-wrapper-tool approach is elegant and the compression numbers are concrete and reproducible.

The Skeptic

Reality Check

Skip

This is a workaround for a problem that MCP server authors and model providers should fix natively. Adding another proxy layer to your local development setup increases debugging complexity, and the 4,096-token output cap could silently truncate important data from tool responses.

The Futurist

Big Picture

Ship

Schema proliferation is becoming a real scalability ceiling for agentic systems. tldr's dynamic tool discovery approach — where the model learns which tools exist on-demand — hints at how future agent routing layers will work at scale across hundreds of specialized MCP endpoints.

The Creator

Content & Design

Ship

For anyone using AI agents to manage creative workflows across multiple platforms, the context savings translate directly to more coherent, focused outputs. Less schema bloat means the model spends more attention on your actual task.

Community Sentiment

Overall: 155 mentions
57% positive, 33% neutral, 10% negative

Hacker News: 45 mentions

86% schema compression is the headline number

Reddit: 30 mentions

Practical fix for multi-MCP context bloat

Twitter/X: 80 mentions

Five wrapper tools design is clever