This month, we're bridging the gap between APIs and AI-native applications. You can now generate Model Context Protocol (MCP) servers directly from your API specifications, making your APIs instantly accessible through tools like Claude Desktop and Cursor. We've also expanded webhook and callback support across PHP and Go SDKs with built-in security features, and extended our LLM context generation to include event-based APIs.
We’ve also rolled out major UX improvements to the API Playground to surface important configurations like API base URLs and environments, making it even simpler for developers to test APIs in the Developer Portal.
The Model Context Protocol (MCP) enables AI applications, such as Claude Desktop, VS Code, and Cursor, to connect seamlessly with APIs. We've added MCP Server generation to the APIMatic Platform to help API providers generate AI-native tools for their APIs. This means developers and non-technical users alike can call your APIs directly through a natural language interface, using their favourite AI tools.
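For context, MCP clients such as Claude Desktop discover local servers through a JSON configuration file. A generated server might be registered along these lines — note this is an illustrative sketch only, and the server name, command, and package are hypothetical placeholders, not APIMatic's actual distribution mechanism:

```json
{
  "mcpServers": {
    "acme-payments-api": {
      "command": "npx",
      "args": ["-y", "@acme/payments-mcp-server"],
      "env": {
        "ACME_API_KEY": "<your-api-key>"
      }
    }
  }
}
```

Once registered, the AI client lists the server's tools (typically one per API endpoint) and can invoke them in response to natural language requests.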
For APIMatic’s SDK or Developer Portal users, generating an MCP server is as simple as turning on a feature flag; APIMatic takes care of the rest:
Join the Alpha Program:
MCP Server generation is currently in alpha. Email support@apimatic.io to get early access and help shape this capability.
🔗 See the MCP Server Generation documentation for more details.
You can now easily handle API notifications and event-driven workflows in your PHP and Go SDKs—no extra setup required. With built-in type-safe handlers and automatic signature verification, you can manage asynchronous events with greater reliability and less boilerplate code.
Previously, developers had to manually parse event payloads, map data to SDK types, and verify HMAC signatures. Now, your SDKs automatically generate type-safe webhook handlers for receiving events, type-safe callback handlers for async responses, and built-in HMAC signature verification for secure event validation.
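To illustrate what the built-in verification handles for you, here is a minimal Go sketch of HMAC-SHA256 signature checking — the kind of boilerplate the generated SDKs now encapsulate. The function names and payload shape are hypothetical, not the SDK's actual API:

```go
package main

import (
	"crypto/hmac"
	"crypto/sha256"
	"encoding/hex"
	"fmt"
)

// signPayload computes the hex-encoded HMAC-SHA256 of a webhook payload.
func signPayload(payload, secret []byte) string {
	mac := hmac.New(sha256.New, secret)
	mac.Write(payload)
	return hex.EncodeToString(mac.Sum(nil))
}

// verifySignature recomputes the expected signature and compares it to the
// received one in constant time, which guards against timing attacks.
func verifySignature(payload []byte, signature string, secret []byte) bool {
	expected := signPayload(payload, secret)
	return hmac.Equal([]byte(expected), []byte(signature))
}

func main() {
	payload := []byte(`{"event":"invoice.paid","id":"evt_123"}`)
	secret := []byte("whsec_example")

	// The provider signs the payload; the receiver verifies it.
	sig := signPayload(payload, secret)
	fmt.Println("valid:", verifySignature(payload, sig, secret)) // prints "valid: true"
}
```

With generated handlers, this verification runs automatically before your typed event handler is invoked, so a payload with a bad signature never reaches application code.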
This enhancement simplifies event-driven API integration and ensures a consistent developer experience across all supported SDKs.
🔗 See the changelog to learn more.
You can now leverage AI assistants like ChatGPT and Claude to answer questions about your event-based APIs. We've extended LLM context generation to include Webhooks and Callbacks in both llms.txt and llms-full.txt files.
This means developers working with your API can ask AI tools about webhook payload structures, callback implementations, and signature verification—and get accurate, contextual answers instantly. Your llms.txt now includes direct links to webhook documentation for quick discovery, while llms-full.txt provides complete event descriptions, payload schemas with data types, and code samples across supported frameworks.
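As a rough sketch of what this looks like in practice, an llms.txt file with event coverage might contain entries like the following — the API name, event names, and URLs here are entirely hypothetical:

```markdown
# Acme Payments API

> REST API for payment processing, with webhooks for asynchronous event delivery.

## Webhooks

- [invoice.paid](https://docs.example.com/webhooks/invoice-paid): Fired when an invoice is settled
- [Signature verification](https://docs.example.com/webhooks/security): How to validate HMAC signatures on incoming events
```

The llms-full.txt variant expands each entry inline with full payload schemas and code samples, so an AI assistant can answer implementation questions without following links.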
With this release, you give your developers the same AI-powered support for event-based interactions that they already have for standard endpoints, making your entire API surface more accessible and easier to integrate.
🔗 See LLM context generation documentation for more details.
Previously, essential settings such as environment selection and Base URL inputs were hidden behind a Configure button, which made them hard to locate. The redesigned layout now surfaces these settings directly on the endpoint page for quicker access and a more consistent experience.
Key Enhancements:
🔗 See the changelog to learn more.
Your feedback makes our product better.
Follow us on LinkedIn to stay updated with the latest news from APIMatic!