TL;DR
AI development is shifting from apps to internal platforms. In this episode, Stefan and Jens explore how the Model Context Protocol (MCP) enables LLMs to use APIs as tools, how GraphQL Federation can act as a gateway for MCP, and why the next wave of AI innovation will come from well-structured internal developer platforms.
Turning GraphQL Federation into an MCP Gateway
Jens walks through how WunderGraph’s federation layer can automatically expose GraphQL operations as MCP tools. Instead of manually wiring APIs to AI agents, the federation router generates a JSON Schema for each operation, validates inputs, and provides a single gateway through which every operation is executed.
Every operation in your GraphQL schema can become a tool in MCP. You don’t have to build adapters — the router does it for you.
This approach treats the federated schema as a tool registry, giving AI systems a unified way to query, mutate, and compose data — all with proper type safety and validation.
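To make the "schema as tool registry" idea concrete, here is a minimal sketch of the pattern in TypeScript. It is illustrative only, not WunderGraph's actual implementation: the operation, tool names, and router URL are hypothetical, and a real MCP server would surface these definitions through the protocol's tools/list and tools/call requests.

```typescript
// Sketch: each persisted GraphQL operation becomes a tool definition
// (name, description, JSON Schema for inputs), and a single execute()
// forwards arguments to the federation router.

interface ToolDefinition {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>; // JSON Schema for the operation's variables
  query: string;                        // the GraphQL document to execute
}

// Hypothetical persisted operations exposed by the router.
const tools: ToolDefinition[] = [
  {
    name: "getEmployeeById",
    description: "Fetch a single employee by numeric id",
    inputSchema: {
      type: "object",
      properties: { id: { type: "integer" } },
      required: ["id"],
      additionalProperties: false,
    },
    query: /* GraphQL */ `
      query getEmployeeById($id: Int!) {
        employee(id: $id) { id details { forename surname } }
      }
    `,
  },
];

// Single gateway call: the agent never talks to subgraphs directly,
// only to the federated router (URL is a placeholder).
async function executeTool(name: string, args: Record<string, unknown>) {
  const tool = tools.find((t) => t.name === name);
  if (!tool) throw new Error(`Unknown tool: ${name}`);

  const response = await fetch("http://localhost:3002/graphql", {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ query: tool.query, variables: args }),
  });
  return response.json();
}

// Usage: an MCP server would answer tools/list with `tools` and route
// tools/call requests through executeTool().
executeTool("getEmployeeById", { id: 1 }).then(console.log);
```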
JSON Schema: The Bridge Between APIs and AI
Stefan highlights how JSON Schema generation is the missing piece connecting GraphQL and MCP. By deriving schemas from existing operations, LLMs gain the structured context they need to make safe, predictable calls.
The magic is in generating the JSON Schema automatically — it’s how you make the AI understand what’s possible.
This turns the federated schema into a living contract for AI tools, ensuring every action is validated before it’s executed.
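As a rough illustration of how a JSON Schema can be derived from an operation, the sketch below walks a GraphQL document's variable definitions with the graphql-js parser and maps built-in scalars to JSON Schema types. It only handles scalars and lists; a production generator (like the one discussed here) would also cover input objects, enums, and custom scalars.

```typescript
// Derive a JSON Schema for an operation's variables so an LLM knows
// exactly which inputs a tool accepts.
import { parse, Kind, type TypeNode } from "graphql";

const SCALAR_TO_JSON: Record<string, string> = {
  Int: "integer",
  Float: "number",
  String: "string",
  Boolean: "boolean",
  ID: "string",
};

function typeToJsonSchema(type: TypeNode): { schema: Record<string, unknown>; required: boolean } {
  if (type.kind === Kind.NON_NULL_TYPE) {
    return { schema: typeToJsonSchema(type.type).schema, required: true };
  }
  if (type.kind === Kind.LIST_TYPE) {
    return { schema: { type: "array", items: typeToJsonSchema(type.type).schema }, required: false };
  }
  return { schema: { type: SCALAR_TO_JSON[type.name.value] ?? "string" }, required: false };
}

export function operationToInputSchema(operation: string) {
  const doc = parse(operation);
  const op = doc.definitions.find((d) => d.kind === Kind.OPERATION_DEFINITION);
  if (!op || op.kind !== Kind.OPERATION_DEFINITION) throw new Error("No operation found");

  const properties: Record<string, unknown> = {};
  const required: string[] = [];
  for (const variable of op.variableDefinitions ?? []) {
    const { schema, required: isRequired } = typeToJsonSchema(variable.type);
    properties[variable.variable.name.value] = schema;
    if (isRequired) required.push(variable.variable.name.value);
  }
  return { type: "object", properties, required, additionalProperties: false };
}

// Example: a query with one required Int variable yields
// { type: "object", properties: { id: { type: "integer" } }, required: ["id"], ... }
console.log(operationToInputSchema(`query getEmployeeById($id: Int!) { employee(id: $id) { id } }`));
```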
From Apps to Internal Developer Platforms
The conversation shifts to how AI is changing the way companies build software. Instead of dozens of apps, teams will build internal platforms where AI agents orchestrate data and workflows through APIs.
The future isn’t building more dashboards — it’s building the platform your AI can use to get work done.
By abstracting away integration complexity, these platforms empower AI to execute business logic safely, using the same APIs developers already trust.
Why Schema-Aware Design Matters
Jens emphasizes that schema awareness isn’t just a nice-to-have — it’s essential for safe AI orchestration. Without strict contracts, agents guess; with contracts, they reason.
The schema is the guardrail. It’s how you limit harm and give AI permission to act.
This philosophy mirrors how GraphQL revolutionized frontend integration — but now applied to AI agents acting across systems.
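One way to picture the guardrail in code: validate the agent's arguments against the tool's JSON Schema before anything reaches the gateway, and hand structured errors back to the model. This is a small sketch using the Ajv validator; the schema and the commented-out executeTool() call refer to the hypothetical example above.

```typescript
// Guardrail sketch: reject invalid tool arguments before any API call,
// with errors the LLM can read and correct.
import Ajv from "ajv";

const ajv = new Ajv({ allErrors: true });

const inputSchema = {
  type: "object",
  properties: { id: { type: "integer" } },
  required: ["id"],
  additionalProperties: false,
} as const;

const validate = ajv.compile(inputSchema);

function guardedCall(args: unknown) {
  if (!validate(args)) {
    // Structured errors let the model reason about what went wrong instead of guessing.
    return { ok: false, errors: ajv.errorsText(validate.errors) };
  }
  return { ok: true /*, result: await executeTool("getEmployeeById", args) */ };
}

console.log(guardedCall({ id: "not-a-number" })); // rejected before any API call
console.log(guardedCall({ id: 1 }));              // allowed through to the gateway
```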
Live Demo: LLMs Calling GraphQL Operations
The episode includes a live demonstration of an MCP-powered workflow, showing how Claude can call federated operations directly. The AI uses the router to discover capabilities, validate inputs, and perform actions — no custom glue code required.
Claude sees the schema, understands the tools, and calls them safely. That’s the future of AI integration.
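For readers who want to reproduce the flow outside Claude, here is a rough client-side sketch using the MCP TypeScript SDK: connect to the router's MCP endpoint, list the tools it exposes, and call one. The endpoint URL and tool name are placeholders, and exact SDK usage may differ between versions.

```typescript
// Sketch of the demo flow: discover tools over MCP, then call one.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

async function main() {
  // Placeholder URL for the router's MCP endpoint.
  const transport = new StreamableHTTPClientTransport(new URL("http://localhost:3002/mcp"));
  const client = new Client({ name: "demo-agent", version: "1.0.0" });
  await client.connect(transport);

  // 1. Discover capabilities: every exposed GraphQL operation shows up as a tool.
  const { tools } = await client.listTools();
  console.log(tools.map((t) => t.name));

  // 2. Call a tool; the router validates the arguments against the generated
  //    JSON Schema and executes the underlying federated operation.
  const result = await client.callTool({
    name: "getEmployeeById",
    arguments: { id: 1 },
  });
  console.log(result);

  await client.close();
}

main().catch(console.error);
```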
Looking Ahead
Stefan and Jens close with a vision of AI-native platforms where federation, schema introspection, and AI orchestration converge.
We’re not building AI features — we’re building the infrastructure AI needs to build features.
The takeaway: the future of API development is schema-first, AI-aware, and internally focused.
This episode was directed by Jacob Javor. Transcript lightly edited for clarity and flow.
