
[Workshop 101] Build an Enterprise-Grade AI Platform with Kong AI Gateway

Join us for an in-depth session, "Build an Enterprise-Grade AI Platform with Kong AI Gateway," where we'll explore how organizations can securely operationalize AI at scale. Discover how Kong AI Gateway unifies APIs and AI services under one platform—enabling centralized governance, cost-efficient observability, and consistent policy enforcement across LLMs, embeddings, and AI workflows. Learn how to accelerate innovation while maintaining compliance and performance through capabilities like prompt introspection, semantic caching, and dynamic routing across private and public AI models.
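One way to picture the gateway pattern described above: the client sends a plain chat-style request to a single gateway route, and policies like semantic caching, prompt introspection, and model routing are applied server-side. A minimal sketch, assuming a hypothetical gateway URL and an OpenAI-style request shape:

```python
"""Minimal sketch: an LLM call addressed to an AI-gateway route.

The gateway URL below is a hypothetical placeholder; the point is that
the client payload stays an ordinary chat request, and the gateway
applies caching, introspection, and routing policies transparently.
"""
import json
from urllib.request import Request

GATEWAY_URL = "https://gateway.example.com/ai/chat"  # hypothetical route

def build_chat_request(prompt: str, model: str = "gpt-4o-mini") -> Request:
    """Build an OpenAI-style chat request aimed at the gateway, not a vendor."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return Request(GATEWAY_URL, data=body,
                   headers={"Content-Type": "application/json"})

# The gateway would forward this to a private or public model per policy.
req = build_chat_request("Summarize our Q3 API traffic.")
```

Because the client only knows the gateway route, swapping or load-balancing the underlying models is a gateway-side configuration change, not a code change.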

No AI Without API: Build AI-Native Apps with Konnect, AI Gateway & Event Gateway

Agents are just applications with an intelligence layer. This session shows why secure, reliable, discoverable APIs—and a developer platform—are the foundation for AI-native workflows. See live demos of Kong Konnect, AI Gateway, Event Gateway, MCP-enabled dev portals, AI Composer/Runner, and Kai. Subscribe for more on API platforms, service connectivity, and AI-native architecture.

[Developer Day] Developer Portal, Observability & Identity with Kong Konnect

This hands-on Developer Day builds on the first two workshops (Platform setup and APIOps automation) to deliver a complete developer experience. You’ll stand up a Kong Konnect Developer Portal with self-registration and application onboarding, wire the portal to an OIDC identity provider for SSO and token issuance, and publish APIs and docs from your APIOps pipeline. In parallel, you’ll configure Konnect Analytics to capture metrics and usage data from your Kong gateways and services.
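The OIDC wiring described above typically boils down to two client-side pieces: requesting a token from the identity provider and presenting it to the gateway-fronted API. A minimal sketch of the client-credentials grant, with a hypothetical IdP endpoint and credentials:

```python
"""Sketch of the OIDC client-credentials flow implied by the portal wiring.

The token endpoint, client ID, and secret are hypothetical placeholders
for whatever identity provider the Konnect Developer Portal is wired to.
"""
from urllib.parse import urlencode

TOKEN_ENDPOINT = "https://idp.example.com/oauth2/token"  # hypothetical IdP

def client_credentials_body(client_id: str, client_secret: str) -> bytes:
    """Form-encode a client_credentials grant for the token endpoint."""
    return urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
    }).encode()

def bearer_header(access_token: str) -> dict:
    """Header an onboarded application sends to a Kong-proxied API."""
    return {"Authorization": f"Bearer {access_token}"}
```

An onboarded app would POST the form body to the token endpoint, then attach the returned access token via `bearer_header` on every API call, letting the gateway validate it against the IdP.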

The Connectivity Layer of AI | Kong Konnect for Agentic Digital Experiences

APIs are the nervous system of AI. In this API Summit 2025 keynote, Kong outlines the AI connectivity layer: unifying API and LLM traffic, adopting agent protocols like MCP, introducing an AI gateway pattern, and enabling usage-based monetization with real-time metering and billing in Konnect. Subscribe for more on API and AI connectivity, and explore Kong Konnect to standardize governance, scale LLM/agent traffic, and monetize usage.

How to Secure Your APIs with Kong Gateway Plugins

Have you ever wondered how developers can add features like security and traffic control to their apps without spending weeks coding? Kong's API gateway plugins are the answer. Think of your app as a secure office building, your API as the lobby, and Kong Gateway as the head of security you've hired to manage it. Plugins are the security team and tools—like a guard managing traffic flow or a system to scan employee badges—that protect your "lobby."
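In practice, attaching a plugin is a small configuration call rather than weeks of coding. A hedged sketch of what that looks like against Kong's Admin API: `rate-limiting` is a standard Kong plugin, but the Admin URL and service name below are placeholders, and exact config fields vary by Kong version:

```python
"""Sketch: the JSON payload that attaches Kong's rate-limiting plugin
to a service via the Admin API. The Admin URL and service name are
hypothetical; check your Kong version's docs for exact config fields.
"""
import json

ADMIN_URL = "http://localhost:8001"  # Kong Admin API's conventional port

def rate_limit_plugin_payload(per_minute: int) -> bytes:
    """JSON body enabling the rate-limiting plugin: N requests per minute."""
    return json.dumps({
        "name": "rate-limiting",
        "config": {"minute": per_minute},
    }).encode()

# POSTing this to {ADMIN_URL}/services/orders-api/plugins would attach
# the "guard managing traffic flow" to that one service, with no app code.
payload = rate_limit_plugin_payload(5)
```

The same pattern—a `name` plus a `config` object—applies to other plugins like key authentication, which is the "badge scanner" in the analogy above.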

How AI Agents Actually Call APIs: 5 Common Misconceptions

Ever wondered how AI agents and Large Language Models (LLMs) connect to real-world data and services? It’s not magic—it’s a well-structured process. This video breaks down the five most common misunderstandings about how LLMs call APIs, databases, and other custom tools, and explains the crucial role of the Model Context Protocol (MCP) in creating reliable and powerful AI agents.
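The "well-structured process" is essentially a loop: the model emits a tool name plus arguments, the application executes the matching function, and the result goes back into the conversation. A minimal sketch of that dispatch step—the tool names and call shape here are illustrative, not a specific MCP SDK:

```python
"""Sketch of the tool-calling loop behind 'agents calling APIs':
the LLM never executes anything itself; it only names a tool and
its arguments, and the application does the actual call.
"""

def get_weather(city: str) -> str:       # an ordinary API wrapper
    return f"Sunny in {city}"            # stub standing in for a real HTTP call

TOOLS = {"get_weather": get_weather}     # registry the model can target

def dispatch(tool_call: dict) -> str:
    """Execute one model-issued call: {'name': ..., 'arguments': {...}}."""
    fn = TOOLS[tool_call["name"]]
    return fn(**tool_call["arguments"])

# A model response might contain a structured call like this:
call = {"name": "get_weather", "arguments": {"city": "Paris"}}
result = dispatch(call)  # fed back to the model as the tool's output
```

Protocols like MCP standardize how that registry of tools is discovered and described, so agents and servers don't need bespoke glue for every integration.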

What is an API gateway?

An API Gateway is the professional digital bouncer at the door of your company's digital services. It creates a single entry point to improve security and organization, inspecting every request and routing it to the microservice it needs to reach. It checks IDs (authentication), directs traffic (routing), and provides crowd control (rate limiting) to make sure everything runs smoothly.
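The three duties in the analogy—checking IDs, directing traffic, crowd control—fit in a few lines. This is a purely pedagogical toy, not how a real gateway is built:

```python
"""Toy gateway illustrating authentication, routing, and rate limiting.
Everything here is in-memory and pedagogical; real gateways do this
at the network layer with pluggable policies.
"""
from collections import defaultdict

ROUTES = {"/users": "user-service", "/orders": "order-service"}
VALID_KEYS = {"key-123"}
LIMIT = 3
counts = defaultdict(int)  # requests seen per API key

def gateway(path: str, api_key: str) -> str:
    if api_key not in VALID_KEYS:        # check IDs (authentication)
        return "401 Unauthorized"
    counts[api_key] += 1
    if counts[api_key] > LIMIT:          # crowd control (rate limiting)
        return "429 Too Many Requests"
    service = ROUTES.get(path)           # direct traffic (routing)
    return f"200 -> {service}" if service else "404 Not Found"
```

Because every request passes through this single entry point, the microservices behind it never need to reimplement any of these checks themselves.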

PII Sanitization with Kong

Using sensitive user data for analytics, development, or training AI models introduces significant security risks like data breaches and costly PII (Personally Identifiable Information) leakage. These incidents can lead to heavy fines and a critical loss of customer trust. Watch this demo to see how the Kong AI Gateway automatically finds and sanitizes PII in real-time before requests ever reach your upstream services or Large Language Models (LLMs).
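The core idea—detect PII spans in a request body and replace them before anything reaches an upstream service or LLM—can be sketched with two regexes. A real gateway uses far richer detectors; the patterns below are illustrative only:

```python
"""Minimal sketch of PII sanitization: replace detected PII with typed
placeholders before the text leaves your perimeter. The two patterns
here (email, US SSN) are illustrative, not production-grade detectors.
"""
import re

PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def sanitize(text: str) -> str:
    """Replace each detected PII span with a labeled placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

clean = sanitize("Contact jane.doe@example.com, SSN 123-45-6789")
# -> "Contact [EMAIL], SSN [SSN]"
```

Placing this step at the gateway means one policy covers every upstream consumer—analytics pipelines, dev environments, and LLM prompts alike—instead of each team redacting independently.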