
Announcing Kubernetes Ingress Controller 3.5

We're happy to announce the 3.5 release of Kong Ingress Controller (KIC). This release includes the graduation of combined services to General Availability, support for connection draining, and the beginning of deprecation for some Ingress types as we help customers move to the Kubernetes Gateway API. Let's get into the details!

Kong AI Gateway 3.11: Reduce Token Spend, Unlock Multimodal Innovation

Today, I'm excited to announce one of our largest Kong AI Gateway releases (3.11), which ships with several new features critical to building modern, reliable AI agents in production. We strongly recommend updating to this version to get access to the latest and greatest that AI infrastructure has to offer.

Kong Gateway Enterprise 3.11 Makes APIs & Event Streams More Powerful

We’re excited to bring you Kong Gateway Enterprise 3.11 with compelling new features to make your APIs and event streams even more powerful. We’ll also touch on what’s new with Konnect networking and Active Tracing. There’s a lot to unpack, so keep on reading for the full story!

Build Your Own Internal RAG Agent with Kong AI Gateway

RAG (Retrieval-Augmented Generation) is not a new concept in AI, and unsurprisingly, when talking to companies, everyone seems to have their own interpretation of how to implement it. So, let’s start with a refresher. RAG is a technique that injects relevant data from an external knowledge source directly into a prompt before sending it to a Large Language Model (LLM). “But wait, my model is already fine-tuned on my domain-specific data…”
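The retrieve-then-inject flow described above can be sketched in a few lines. This is a toy illustration, not Kong AI Gateway code: the keyword-overlap retriever, the document list, and the prompt template are all assumptions made for the example, where a real deployment would use an embedding-based vector search.

```python
def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query (toy retriever)."""
    query_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(query_terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query: str, context: list[str]) -> str:
    """Inject the retrieved context into the prompt before it reaches the LLM."""
    context_block = "\n".join(f"- {c}" for c in context)
    return (
        "Answer using only the context below.\n"
        f"Context:\n{context_block}\n\n"
        f"Question: {query}"
    )

# Illustrative knowledge base.
docs = [
    "Kong AI Gateway routes traffic to multiple LLM providers.",
    "Kafka partitions distribute load across brokers.",
    "The AI Gateway can enforce token-based rate limits.",
]

question = "How does the AI Gateway handle LLM traffic?"
prompt = build_prompt(question, retrieve(question, docs))
```

The key point is that the model itself is unchanged: the grounding happens entirely at prompt-construction time, which is why a gateway sitting in front of the LLM is a natural place to do it.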

AI Gateway Benchmark: Kong AI Gateway, Portkey, and LiteLLM

In February 2024, Kong became the first API platform to launch a dedicated AI gateway, designed to bring production-grade performance, observability, and policy enforcement to GenAI workloads. At its core, Kong’s AI Gateway provides a universal API to enable platform teams to centrally secure and govern traffic to LLMs, AI agents, and MCP servers. Additionally, as AI adoption in your organization begins to skyrocket, so do AI usage costs.

What is API Security? Fundamentals & Strategies

APIs are the digital lifelines powering modern applications, microservices, IoT devices, and everything in between. They act as the universal translators of data, ferrying information between diverse software platforms. API security encompasses the technologies, practices, and protocols dedicated to protecting these invisible workhorses from unauthorized access, data breaches, and malicious misuse.

Is Ambient Mesh the Future of Service Mesh?

The word on the street is that Ambient Mesh is the obvious evolution of service mesh technology — leaner, simpler, and less resource-intensive. But while Ambient Mesh is an exciting development, the reality is more nuanced. It may well be that a sidecar-based mesh is still the better fit for your workloads and organization.

Kong Konnect: Introducing HashiCorp Vault Support for LLMs

If you're a builder, you likely keep sending your LLM credentials on every request from your agents and applications. But if you operate in an enterprise environment, you'll want to store your credentials in a secure third-party secret manager like HashiCorp Vault or an identity provider (IdP), and have the infrastructure inject the credentials for you dynamically.
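The injection pattern described above can be sketched as follows: the application configures a vault *reference* instead of a raw key, and the infrastructure resolves it at request time. This is a minimal illustration under stated assumptions — the in-memory `SECRET_STORE`, the `{vault://...}` reference format, and the helper names are stand-ins for a real HashiCorp Vault backend and its client, not Kong Konnect's actual API.

```python
import re

# Stand-in for secrets held in an external vault (illustrative only).
SECRET_STORE = {"llm/openai/api-key": "sk-example-not-a-real-key"}

# Hypothetical reference syntax: {vault://<path/to/secret>}
VAULT_REF = re.compile(r"^\{vault://([^}]+)\}$")

def resolve(value: str) -> str:
    """Return the secret if value is a vault reference, else the value itself."""
    match = VAULT_REF.match(value)
    if not match:
        return value
    return SECRET_STORE[match.group(1)]

def build_headers(configured_key: str) -> dict[str, str]:
    """Inject the resolved credential into the outbound LLM request headers."""
    return {"Authorization": f"Bearer {resolve(configured_key)}"}

# The application only ever sees the reference, never the raw key.
headers = build_headers("{vault://llm/openai/api-key}")
```

The benefit is that the raw credential never lives in application config or agent code: rotating the key in the vault takes effect on the next request, with no redeploy.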

5 Steps to Immediately Reduce Kafka Cost and Complexity

Kafka delivers massive value for real-time businesses — but that value comes at a cost. As usage grows, so does complexity: more clusters, more topics, more partitions, more ACLs, more custom tooling. But it doesn’t have to be that way. If your team is managing Kafka at scale, here are five concrete steps you can take to immediately reduce cost and operational complexity — without sacrificing performance.