⏩ OpenAI Embraces MCP: The Protocol Era of AI Has Arrived
Just when the dust was settling on my recent post “MCP: The Ultimate API Consumer (Not the API Killer)”, the landscape has shifted dramatically. In a fascinating development that aligns with the evolution discussed in that piece, OpenAI has announced full support for MCP across its product line.
Sam Altman confirmed that MCP support is immediately available in OpenAI’s Agents SDK, with support for ChatGPT’s desktop app and the Responses API coming soon. As quoted in VentureBeat: “We’re entering the protocol era of AI. This is how agents will actually do things.”
The Convergence Has Begun
This isn’t just another tech announcement—it’s the beginning of a significant consolidation around standards in the AI industry. With both Anthropic and OpenAI now backing the protocol, we’re seeing the emergence of a common language for AI-tool interactions that will benefit the entire ecosystem.
Microsoft has also thrown its weight behind MCP, releasing a Playwright-MCP server that allows AI assistants like Claude to browse the web and interact with sites using the Chrome accessibility tree. This collaboration demonstrates how quickly major tech players are aligning around MCP as a standard.
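As a concrete illustration, a server like Playwright-MCP is typically wired into an MCP client through its configuration file. The sketch below assumes the common Claude Desktop `mcpServers` config shape; the exact package invocation may differ by version:

```json
{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
```

Once registered, the client launches the server as a subprocess and the assistant discovers its browsing tools automatically.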
What makes this particularly fascinating is how quickly it’s happening. The industry seems to be converging on what many in the API space have observed: MCP isn’t replacing APIs—it’s amplifying their usage and effectiveness by creating standardized ways for AI to consume them.
OAuth2 Support: Enterprise-Grade Security Arrives
One of the key concerns I highlighted in my original article was security—specifically the expanded “blast radius” of potential vulnerabilities when AI agents interact with systems on our behalf. The MCP community has moved quickly to address this.
The recent addition of OAuth 2.1 support to the MCP specification may prove to be a crucial factor in enterprise adoption. MCP’s authorization specification now implements OAuth 2.1 with appropriate security measures for both confidential and public clients, enabling secure, delegated authorization between MCP clients and protected servers.
This matters because it transforms MCP from an interesting technical experiment into something that can be deployed in production environments with proper security controls. OAuth 2.1 provides the authorization layer that enterprises require before they’ll consider integrating AI systems with their critical business data.
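One concrete consequence of adopting OAuth 2.1 is that PKCE (RFC 7636) is mandatory for all clients, public or confidential. A minimal sketch of generating the PKCE pair a client would use, in Python with only the standard library:

```python
import base64
import hashlib
import secrets

def make_pkce_pair() -> tuple[str, str]:
    """Generate an OAuth 2.1 PKCE code_verifier and its S256 code_challenge."""
    # 32 random bytes -> a 43-character base64url verifier,
    # within RFC 7636's required 43-128 character range.
    verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return verifier, challenge

verifier, challenge = make_pkce_pair()
# The client sends `challenge` (with method S256) in the authorization request,
# then proves possession by sending `verifier` in the token request.
```

This is what prevents an intercepted authorization code from being redeemed by anyone other than the client that started the flow.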
What This Means for API Ecosystems
The rapid adoption we’re seeing suggests that MCP is indeed functioning as an API consumer rather than an API replacement. MCP’s growing success appears to be driving more API traffic as AI agents tap into more services through standardized connections.
Let’s think about what this means for different players in the ecosystem:
- For API Producers: The opportunity to reach a new class of consumers—AI agents—with minimal additional work. Your existing REST APIs can be wrapped in MCP servers, instantly making them available to a wide range of AI assistants.
- For Developers: A significant reduction in integration complexity. Instead of building custom connectors for each AI platform, you can build once for MCP and reach multiple platforms.
- For Platform Providers: The ability to offer a richer ecosystem of integrations without having to build and maintain every connection themselves.
- For End Users: More capable AI assistants that can seamlessly work with their existing tools and data.
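To make the "wrap your REST API" point concrete: MCP speaks JSON-RPC 2.0, so an MCP server in front of an existing API is largely a translator from `tools/call` requests into HTTP calls. Here is a minimal sketch of that translation; the tool name, endpoint URL, and routing table are illustrative, not from any real service:

```python
import json

# Hypothetical mapping from MCP tool names to existing REST endpoints.
TOOL_ROUTES = {
    "get_forecast": {"method": "GET", "url": "https://api.example.com/v1/forecast"},
}

def handle_tools_call(request: dict) -> dict:
    """Translate an MCP `tools/call` JSON-RPC request into a REST call plan."""
    params = request["params"]
    route = TOOL_ROUTES[params["name"]]
    # A real server would perform the HTTP request here; for illustration we
    # return the call plan wrapped in a JSON-RPC result envelope.
    plan = {**route, "query": params.get("arguments", {})}
    return {
        "jsonrpc": "2.0",
        "id": request["id"],
        "result": {"content": [{"type": "text", "text": json.dumps(plan)}]},
    }

req = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "get_forecast", "arguments": {"city": "Seattle"}},
}
resp = handle_tools_call(req)
```

The interesting part is how thin this layer is: the API does the real work, and MCP just gives AI agents a standard way to discover and invoke it.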
Local vs. Remote: The Next Evolution
While most of the initial MCP implementations focused on local connections (running servers on your own machine), we’re already seeing movement toward remote MCP servers. Cloudflare just announced support for building and deploying remote MCP servers, creating the opportunity to reach users who aren’t going to install and run MCP servers locally.
This transition from local to remote MCP connections mirrors the evolution we saw from desktop software to web-based applications. It’s a necessary step to reach mainstream adoption, and the OAuth 2.1 support we discussed earlier becomes even more critical in this context.
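For clients that only speak local stdio today, remote servers are often reached through a local bridge process. A hedged sketch, assuming the community `mcp-remote` adapter and a placeholder server URL:

```json
{
  "mcpServers": {
    "my-remote-server": {
      "command": "npx",
      "args": ["mcp-remote", "https://my-mcp-server.example.com/sse"]
    }
  }
}
```

The bridge handles the remote transport and the OAuth handshake, which is exactly where the authorization spec discussed above comes into play.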
Looking Forward: It’s APIs All the Way Down
The “protocol era of AI” is here, and it’s built on APIs. Rather than replacing our API management strategies, MCP reinforces their importance while opening new possibilities for how we design, document, and deploy APIs.
It’s incredible how quickly the ecosystem is aligning around this standard. When both OpenAI and Anthropic—fierce competitors in the frontier AI space—agree on an approach, it signals a potential inflection point for the industry.
I expect we’ll see an explosion of MCP servers in the coming months, creating new opportunities for API providers to extend their reach and impact. And as we build more sophisticated AI agents, the symbiotic relationship between MCP and APIs will only grow stronger.
As suggested in the original article: with MCP, it appears to be APIs all the way down.
What are your thoughts on these developments? Do you see OpenAI’s adoption changing the MCP landscape? Let me know!
If you have more thoughts on the coming wave of technology, reach out to me on LinkedIn: https://linkedin.com/in/kevinswiber.