
Introducing Kong A2A and MCP Metrics: Visibility and Governance for AI Tool Adoption at Scale

Scaling LLM and agentic AI adoption from pilot programs to enterprise-wide deployments is a massive logistical undertaking. As AI and agentic usage grow, so does a nagging question for leadership: **Are agents using the right tools to get the job done?** While raw infrastructure metrics might tell you if a server is "up," they fail to tell you whether your AI investment is being leveraged.

Automating Agreement Workflows with Kong Konnect and Docusign for Developers

Digital agreements are at the heart of many critical business processes. As companies modernize their technology stacks and adopt API-driven architectures, integrating agreement workflows directly into applications has become increasingly important. Traditional agreement processes were slow and heavily manual. Documents were often created in office tools, shared through email, printed, signed physically, and stored across multiple systems.

No More Static Secrets: Kong Expands Cloud-Native Authentication Support

How Kong Gateway 3.14 closes the consistency gap in IAM-based authentication across AWS, Azure, and GCP, and what it means for your production deployments.

Enterprise security teams have clear requirements: no static credentials, no exceptions. Every service-to-service connection, whether it's Kong talking to databases, caches, or vaults, should authenticate using the same IAM-based identity model that governs the rest of their cloud infrastructure.

Beyond Static Routing: Modernizing API Logic with Conditional Policy Execution

Modern API architectures are no longer linear. A single request can traverse multiple layers of authentication, transformation, enrichment, and observability. As these flows grow more dynamic, the need for fine-grained control over when plugins execute becomes critical. For years, the standard approach to API Gateway configuration followed a strict hierarchical model: you applied a plugin to a Service, a Route, or a Consumer.
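To make the hierarchical model concrete, here is a minimal sketch of a declarative configuration in Kong's `kong.yml` format, attaching a rate-limiting plugin to a single route. The service name, route name, and upstream URL are illustrative, not taken from the article:

```yaml
_format_version: "3.0"
services:
  - name: orders-service          # illustrative service name
    url: http://orders.internal:8080
    routes:
      - name: orders-route
        paths:
          - /orders
        plugins:
          - name: rate-limiting   # plugin scoped to this one route
            config:
              minute: 100
              policy: local
```

Under the strict hierarchical model, this plugin fires for every request matching the route, regardless of request content; conditional policy execution is about relaxing exactly that constraint.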

Govern the Full AI Data Path with Kong AI Gateway 3.14

The shift from single-model AI features to multi-agent pipelines is no longer a future concern — it's running in production today. MCP has become the de facto protocol for tool-calling, agent-to-agent (A2A) communication patterns are proliferating, and enterprise teams are wiring together complex AI workflows that span multiple providers, services, and agents. Every hop in that data path is an opportunity for something to go wrong. The challenge is governance.

Introducing Kong Agent Gateway: The Complete AI Gateway for Agent-to-Agent Communication

Kong Agent Gateway is here, and it completes the AI data path. Not long ago, AI traffic was simple: a request going to a model, a response coming back, and a gateway in between to enforce policy. With the right solutions, that became manageable pretty quickly. That world is over. Today's agentic architectures look nothing like it. Agents are delegating tasks to other agents via A2A, and those agents are producing and consuming event streams.

Build vs Buy: The Hidden Costs of DIY MCP Server Infrastructure

You whipped up a simple MCP server prototype over the weekend. It routed a single AI agent to a few internal tools, your demo impressed leadership, and the team asked the dreaded question: "When can we ship?" You smiled and said, "Give me two weeks." Fast-forward three months. You're firefighting expired tokens at 2 AM. The compliance team is camped in your inbox. Your once-elegant codebase is now a distributed systems nightmare. Sound familiar?

Open Banking: The Guide on APIs, Regulations, and the Future of Finance

The global financial services industry is undergoing a massive, API-driven revolution. With the global open banking market valued at $31.61 billion in 2024 and projected to grow to $135.17 billion by 2030, this shift is accelerating worldwide. This definitive guide explores the core APIs, the evolving global regulations (including FAPI 2.0, PSD3, and Section 1033), and the massive opportunities shaping the future of finance for banks, fintechs, and enterprises.

LLM Cost Management: How to Implement AI Showback and Chargeback

Every enterprise moving AI into production is about to face a familiar problem in an unfamiliar form: the cost explosion, but for LLMs. This is *very* similar to what happened with cloud. In the early days of cloud, teams spun up infrastructure with no visibility into who was consuming what. Finance got the bill. Engineering got the blame. No one had the data to make good decisions. It took years of hard-won FinOps discipline to fix that. LLM spend is on the same trajectory *and moving faster*.
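At its core, showback is simple arithmetic: attribute token usage to a consumer, multiply by the model's price, and aggregate. The sketch below assumes hypothetical usage records and illustrative per-million-token prices; the record fields and price figures are examples, not a real gateway export schema or a current provider price list:

```python
from collections import defaultdict

# Hypothetical usage records, as a gateway's analytics export might provide.
usage = [
    {"team": "search",  "model": "gpt-4o",      "prompt_tokens": 120_000, "completion_tokens": 30_000},
    {"team": "support", "model": "gpt-4o",      "prompt_tokens": 50_000,  "completion_tokens": 80_000},
    {"team": "search",  "model": "gpt-4o-mini", "prompt_tokens": 400_000, "completion_tokens": 100_000},
]

# Illustrative USD prices per million tokens; real prices vary by provider and date.
price_per_million = {
    "gpt-4o":      {"prompt": 2.50, "completion": 10.00},
    "gpt-4o-mini": {"prompt": 0.15, "completion": 0.60},
}

def showback(records):
    """Aggregate LLM spend per team from token-usage records."""
    totals = defaultdict(float)
    for r in records:
        p = price_per_million[r["model"]]
        cost = (r["prompt_tokens"] * p["prompt"]
                + r["completion_tokens"] * p["completion"]) / 1_000_000
        totals[r["team"]] += cost
    return dict(totals)

print(showback(usage))  # per-team spend in USD
```

Chargeback is the same calculation with teeth: instead of reporting the per-team totals, they are billed back to each team's budget.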

5 Best Practices for Securing Microservices at Scale

The microservices revolution promised agility and scalability: teams could deploy faster, scale independently, and innovate without monolithic constraints. But the speed and flexibility you gain also multiply trust boundaries, identities, network paths, and policy decisions. Then came AI, and everything changed. In 2025, the security reality for AI-integrated microservices is stark.