
How to Master AI/LLM Traffic Management with Intelligent Gateways

As businesses increasingly harness the power of artificial intelligence (AI) and large language models (LLMs), a new challenge emerges: managing the deluge of AI requests flooding their systems. This exponential growth in AI traffic is a gratifying predicament in one sense, since it reflects high demand for your AI services, but it also introduces complex challenges that must be addressed for sustainable operations.

How the Rise of Agentic AI is Transforming API Development and Management

The world of artificial intelligence is undergoing a seismic shift, with the emergence of agentic AI redefining the landscape of API development and management. As businesses and developers navigate the complexities of digital transformation, understanding the implications of this groundbreaking technology becomes paramount.

LLM Security: Shield Your AI from Injection Attacks, Data Leaks, and Model Theft

This comprehensive guide will arm you with the knowledge and strategies needed to protect your LLMs from emerging threats. We’ll explore the OWASP LLM Top 10 vulnerabilities in detail and provide actionable approaches to mitigate these risks. By the end of this guide, you’ll have a robust framework for securing your LLMs and ensuring they remain assets rather than liabilities. Let’s dive into the world of LLM security with confidence and clarity.

Everything you need in an API Platform, in Konnect

Bringing our video series to a close with a full run-through of how to use all of the components of Konnect to build your API Platform. Mike and Alex have posted a series of videos breaking down all of the individual API Platform-relevant components of Konnect and how they can be used to satisfy key API Platform requirements. Now, they’re bringing it all together! In this livestream, join Mike and Alex to discuss API Platform needs from day zero through day two and, even better, watch Mike build it all live.

Kong Event Gateway: Unifying API and Events in a Single Platform

Kong customers include some of the most forward-thinking, tech-savvy organizations in the world. And while we’re proud to help them innovate through traditional APIs, the reality is that their ambitions don’t stop there. Increasingly, our customers are investing heavily in real-time data and event streaming.

What is MCP? Diving Deep into the Future of Remote AI Context

The hype for Anthropic’s Model Context Protocol (MCP) has reached a fever pitch. Everyone (including Kong) is releasing something around MCP to ensure they aren't seen as falling behind in the ever-changing AI landscape. However, in this mad dash, there remains confusion around MCP and what this standard actually enables. Some see MCP as a total game-changer, and some see it as little more than a thin and unnecessary wrapper. As usual, the truth lies somewhere in between.

How SeatGeek scaled to 86M+ monthly API requests with Kong Konnect

SeatGeek’s API sprawl was slowing them down—internally and externally. That changed with Kong Konnect. In this quick story, see how SeatGeek used Kong’s API platform to improve visibility, streamline management, and handle over 86 million monthly requests. If you're dealing with fragmented APIs or struggling to scale developer experience, this one's worth a watch.

Streamline AI Usage with Token Rate-Limiting & Tiered Access in Kong

As organizations continue to adopt AI-driven applications, managing usage and costs becomes more critical. Large language models (LLMs), such as those provided by OpenAI, Google, Anthropic, and Mistral, can incur significant expenses when overused. This blog will explore how you can streamline your AI workloads by leveraging Kong’s token rate-limiting and tiered access features.

How to Create a Platform Cross-Charging Model (and Why Not To Do It)

I'm commonly asked by customers for advice on how they can build a good platform cross-charging model for their organization. And my gut reaction is nearly always "don't." We'll come back to why I think that later, but first let's look at what cross-charging means, why you might want it, and how it can be designed.