
Leveraging the MCP Registry in Kong Konnect for Dynamic Tool Discovery

As enterprises start deploying AI agents into real systems, a new architectural challenge is emerging. Agents need a reliable way to discover tools, services, and capabilities dynamically, instead of relying on hardcoded integrations. This is where the Model Context Protocol (MCP) ecosystem is rapidly evolving. MCP servers expose tools and capabilities that AI agents can use. However, once organizations begin deploying multiple MCP servers across environments, the question becomes clear: how do agents reliably discover which tools exist, and where?
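The discovery problem above can be sketched in a few lines. This is a minimal illustration, not Konnect's actual registry API: `tools/list` is the standard MCP method for enumerating a server's tools, while the server names and tool payloads below are made up for the example.

```python
def build_tools_list_request(request_id=1):
    # JSON-RPC 2.0 request for the MCP "tools/list" method,
    # which asks a server to enumerate the tools it exposes.
    return {"jsonrpc": "2.0", "id": request_id, "method": "tools/list"}

def index_tools(responses):
    # Merge tool listings from several MCP servers into a single
    # name -> {server, description} view, the kind of aggregated
    # registry an agent would query instead of hardcoding integrations.
    registry = {}
    for server, resp in responses.items():
        for tool in resp.get("result", {}).get("tools", []):
            registry[tool["name"]] = {
                "server": server,
                "description": tool.get("description", ""),
            }
    return registry

# Fake responses from two hypothetical MCP servers, for illustration only.
fake_responses = {
    "billing-mcp": {"jsonrpc": "2.0", "id": 1, "result": {"tools": [
        {"name": "create_invoice", "description": "Create a customer invoice"},
    ]}},
    "search-mcp": {"jsonrpc": "2.0", "id": 1, "result": {"tools": [
        {"name": "web_search", "description": "Search the public web"},
    ]}},
}
registry = index_tools(fake_responses)
```

With an aggregated view like this, an agent resolves a capability name to the server that provides it at runtime, which is exactly the lookup a central registry makes reliable.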

Kong Simplifies Multicloud Cloud Gateways with Managed Redis Cache

As enterprises race to deploy multicloud architectures and Agentic AI, they face a common bottleneck: "state." To govern AI token usage, manage agent-to-agent communication, or optimize performance via caching, API and AI gateways require a persistence layer to synchronize data. We’re excited to share the GA of Managed Redis cache for Kong Dedicated Cloud Gateways (DCGW).

Configuring Kong Dedicated Cloud Gateways with Managed Redis in a Multi-Cloud Environment

A persistent challenge arises as businesses adopt multicloud architectures and agentic AI: the need for state synchronization. API and AI gateways require a robust persistence layer to synchronize data, whether it's for governing AI token usage, facilitating agent-to-agent communication, or boosting performance through caching.

The Breakdown | API calls and mobile apps

You used an API this morning. Probably before you even got out of bed. That weather app? It's your phone communicating with a server in the cloud — sending a request, getting data back, and displaying it on your screen in seconds. Location. Request format. Expected response. That's the anatomy of an API call. And it's happening constantly across nearly every app on your phone. Hugo Guerrero and Amanda Alcamo break it all down in Episode 2 of The API & AI Breakdown. No jargon. No fluff. Just clarity.
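That anatomy, a location, a request format, and an expected response, can be shown concretely. The endpoint URL and payload below are hypothetical stand-ins for what a weather app would actually call; the response is a canned example rather than a live network call.

```python
import json
from urllib.parse import urlencode

# 1. Location: the server endpoint the app talks to (hypothetical).
base_url = "https://api.example.com/v1/weather"

# 2. Request format: structured parameters encoded into the query string.
params = {"lat": 40.7128, "lon": -74.006, "units": "metric"}
request_url = f"{base_url}?{urlencode(params)}"

# 3. Expected response: structured JSON the app parses and displays.
raw_response = '{"temp": 21.5, "conditions": "clear"}'  # canned example payload
data = json.loads(raw_response)
display = f"{data['temp']} degrees and {data['conditions']}"
```

Every field has an agreed-upon name and type on both sides; that shared contract is what lets the phone and the server understand each other in a single round trip.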

AI Input vs. Output: Why Token Direction Matters for AI Cost Management

In the burgeoning intelligence economy, AI tokens are a metered utility, but enterprise profitability now hinges on a critical distinction: output tokens can cost up to 10x more than inputs, creating a new, invisible risk for cost overruns, particularly with Agentic AI. Learn how Kong AI Gateway and Konnect Metering & Billing provide the essential financial control plane to enforce directional guardrails, protect margins, and turn token consumption into realized revenue.
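The directional risk is easy to see with arithmetic. The rates below are illustrative only (chosen so output costs 10x input, as the article describes), not any provider's actual pricing: two requests with the same total token count can differ in cost by several multiples depending on which direction the tokens flow.

```python
def request_cost(input_tokens, output_tokens, in_rate, out_rate):
    # Price input and output tokens separately, per 1,000 tokens.
    return (input_tokens / 1000) * in_rate + (output_tokens / 1000) * out_rate

# Illustrative rates where output tokens cost 10x input tokens.
IN_RATE, OUT_RATE = 0.003, 0.03

# Same 10,000 total tokens, opposite directions.
mostly_input = request_cost(9000, 1000, IN_RATE, OUT_RATE)   # 0.057
mostly_output = request_cost(1000, 9000, IN_RATE, OUT_RATE)  # 0.273
```

A budget that only counts total tokens would treat these two requests as identical, which is why guardrails that meter input and output separately matter for agentic workloads that generate long outputs.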

Governing Claude Code: How To Secure Agent Harness Rollouts with Kong AI Gateway

The AI coding and Agent Harness approach is no longer experimental. It is likely the most impactful agentic AI use case in production today, and Claude Code is one of the solutions leading the charge. But as engineering teams race to adopt Claude Code across their organizations, a critical question emerges: who's governing all that LLM traffic?

Beyond the Single Payment Provider Lock-in: How Kong Enables Multi-Rail Billing for the AI Era

The recent article on OpenAI overhauling its payment systems to reduce its dependency on Stripe highlights an important tension many digital platform builders face today: how to balance usage-based monetization with the realities of payments infrastructure dependency.