AI Productivity "Crash", Lovable’s $2B Surge, and Why Federation Needs gRPC

TL;DR

In this episode of The Good Thing, Stefan and Jens discuss why they see gRPC Federation as the next step for GraphQL, the claims that AI coding tools slow developers down, the explosive growth of Lovable, and Meta’s potential move away from open source AI. The conversation mixes technical depth with industry context, making it a must-watch for anyone tracking APIs, AI, and infrastructure strategy.

gRPC Federation as the Next Generation

The episode opens with a deep dive into why gRPC Federation matters. Jens explains that many subgraph frameworks are loosely typed, leaving room for errors at scale. The new model compiles SDL into a gRPC spec, letting the Cosmo Router handle translation, batching, and type safety. (Cosmo Connect)

"We move the data loader into the router. Nobody has to implement a data loader anymore—you just implement gRPC and you're done."

The goal is less complexity, faster execution, and freeing teams from Apollo lock-in.
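To make the idea concrete, here is a rough sketch of what compiling a federated subgraph's SDL down to a gRPC spec could look like. The service and message names below are illustrative assumptions, not Cosmo's actual generated output:

```protobuf
// Illustrative GraphQL SDL for a federated subgraph:
//
//   type Product @key(fields: "id") {
//     id: ID!
//     name: String!
//   }
//
// A hypothetical compiled gRPC spec for that entity (names are
// invented for illustration; the real generated schema may differ):
syntax = "proto3";

service ProductService {
  // Entity lookups arrive as a batch of keys in a single RPC,
  // so the subgraph implements one strongly typed method instead
  // of a GraphQL resolver plus a hand-written data loader.
  rpc LookupProductById (LookupProductByIdRequest) returns (LookupProductByIdResponse);
}

message LookupProductByIdRequest {
  repeated string ids = 1;
}

message Product {
  string id = 1;
  string name = 2;
}

message LookupProductByIdResponse {
  repeated Product products = 1;
}
```

The point of the sketch is the shape of the contract: the router owns GraphQL parsing and planning, while the subgraph only fulfills typed, batched RPCs.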

Type Safety, Performance, and Plugins

Jens highlights the technical benefits of the approach. Compiling subgraph SDL to protobuf enforces end-to-end type safety, while shifting batching to the router removes the N+1 problem. Performance gains come from skipping GraphQL parsing at the subgraph level, leaving only lightweight gRPC calls.

He also points to the plugin system, where Cosmo Router can generate adapters for REST, SOAP, or legacy APIs:

"In the long term, nobody should ever write boring integration code again."

AI Tools and the Productivity Debate

The hosts then turn to a widely shared study claiming that AI coding tools slow developers down by 20% (METR, Time). The study recruited 16 experienced developers working on open source projects, randomly assigning issues with or without AI assistance.

"Sixteen developers is the sample size... that's nonsense."

Stefan notes that headlines amplified the claim without context, even though the dataset was too small to generalize.

The Lovable Boom

Lovable’s rise sparks debate. The company hit $75 million ARR in under a year with only 45 employees and just raised a $200 million Series A (TechCrunch). Stefan marvels at the pace, while Jens questions long-term scalability.

"It’s phenomenal growth, but will Lovable forever be the scaffolding tool that never scales to big projects?"

They compare the moment to the dot-com boom: rapid growth, heavy funding, and the unknown of who survives the bubble.

Meta’s Possible Shift Away from Open Source

The last major topic is Meta’s reported plan to move away from open source AI models (NY Times). Jens notes that if running a 2-trillion-parameter model requires data centers full of GPUs, open sourcing it may not matter in practice. Stefan raises the possibility of government influence on the decision.

"It’s not a nuclear arms race, it’s an AI arms race... governments might want these models kept proprietary."

The discussion underscores how AI strategy now intersects with geopolitics.

Closing Thoughts

The episode closes on a lighter note, but the key theme is clear: the future of federation, AI tools, and open source is being redefined. Stefan and Jens bring skepticism where it’s due, but also point to opportunities for developers to escape complexity and focus on real problems.


This episode was directed by Jacob Javor. Transcript lightly edited for clarity and flow.