
What Is an MCP Gateway? Key Features and Benefits

API protocols evolve every few years. We have moved from SOAP to REST, then to GraphQL, gRPC, and AsyncAPI for event-driven systems. Now, with the rise of large language models (LLMs) and AI agents, organizations need a new class of interface that lets agents take action across real systems, not just generate text. LLMs are powerful reasoning engines, but they lack context: they cannot act on their own, and they cannot see real-time data, private information, or internal systems.
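MCP (Model Context Protocol) supplies that missing bridge: a standard JSON-RPC 2.0 interface through which an agent discovers a server's tools and invokes them. A minimal sketch of the message shapes involved, assuming the `tools/list` and `tools/call` method names from the MCP spec; the `get_invoice` tool and its arguments are hypothetical, purely for illustration:

```python
import json

def make_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request, the wire format MCP messages use."""
    req = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        req["params"] = params
    return req

# An agent first discovers which tools the server exposes...
discover = make_request(1, "tools/list")

# ...then asks the server to execute one with concrete arguments.
call = make_request(2, "tools/call", {
    "name": "get_invoice",                    # hypothetical tool name
    "arguments": {"invoice_id": "INV-1042"},  # hypothetical arguments
})

print(json.dumps(call, indent=2))
```

A gateway sits in front of servers like this, adding authentication, rate limiting, and observability to every `tools/call` an agent makes.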

How to Engage AI for Calculating Credit Scoring?

Across the globe, 1.5 billion people remain unbanked, without access to even the most basic financial services. Among those who are banked, fewer than 50% qualify for formal credit, limiting both financial inclusion and lending growth. In an era where traditional credit models struggle to assess evolving financial behaviors, AI credit scoring is emerging as a strategic differentiator for banks and fintechs alike.

Honeybadger year in review: What we shipped in 2025

Happy holidays! 2025 has been a busy and productive time here at 'Badger HQ. While we shipped a lot of cool things, four features really stand out as we look back on the year. We also attended some conferences this year: MicroConf, RailsConf, Laracon, ElixirConf, Rocky Mountain Ruby, and SF Ruby. It was great connecting with so many folks in person, discussing application monitoring, and sharing some delicious meals.

The Age of AI Connectivity

Kong was born to connect. The world is shifting from connecting cloud services with apps to connecting LLMs through agents. API calls and tokens are moving in tandem; a new unit of intelligence is forming. As AI traffic explodes into hypervolumes, speed is all that matters. The same principles of performance, security, and reliability behind Kong are essential in an agentic world. A new connectivity layer for AI is born.

Expanded Observability, Orchestration, and Security with Kong Gateway 3.13

Discover expanded OpenTelemetry support, full cloud-native authentication, powerful Datakit orchestrations, and PCI DSS 4.0 attestation for Kong Konnect and Cloud Gateways. As API ecosystems grow more complex, maintaining visibility and security shouldn't be a hurdle. Kong Gateway 3.13 addresses these challenges with expanded OpenTelemetry support and more flexible orchestration, making your APIs more observable and your traffic easier to orchestrate.

An Early Christmas Present for the AI C-Suite: Metering & Billing Comes to Kong Konnect

Happy holidays from the Kong team! Consider this our gift to everyone who's been asked "what's the ROI on AI?" one too many times. The AI boom has a dirty secret: for most enterprises, it's bleeding money. Every LLM call, every agent invocation, every API request that powers your AI products — they all cost something. And right now, most organizations have no idea what they're spending, who's spending it, or how to control that spend. Finance teams are flying blind.

Move More Agentic Workloads to Production with AI Gateway 3.13

Kong AI Gateway 3.13 helps enterprises move from AI experimentation to shipping production-grade agents. It unlocks new capabilities focused on agentic security, developer productivity, and resilience, including MCP tool-level access control, expanded provider support, and smarter load balancing.

An Enterprise Guide to PCI DSS Compliance Requirements

If your company handles customer payment information, it’s critical for you to understand PCI DSS compliance requirements. A single breach can result in substantial financial penalties and damage your brand's reputation. In my experience working with enterprise customers, I’ve seen firsthand how non-production environments often become a blind spot for compliance efforts.

Accelerate your Releases with AI-Driven Test Prioritization

Testing is changing, and AI is leading the next step. Every QA team faces the same pressure: test more, deliver faster, and never miss a defect. But as projects grow and release cycles shorten, running every test in every sprint isn't always realistic. The challenge isn't just about automating more; it's about deciding what to test first. That's where AI comes in.
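One simple way to "decide what to test first" is to score each test on signals such as its recent failure rate and how long it has been since it last ran, then execute the suite in descending score order. A minimal sketch under those assumptions; the weights, signals, and test names are illustrative, not any particular vendor's algorithm:

```python
from dataclasses import dataclass

@dataclass
class TestStats:
    name: str
    recent_failure_rate: float  # fraction of the last N runs that failed (0..1)
    days_since_last_run: int    # staleness: older results are less trustworthy

def priority(t: TestStats, w_fail: float = 0.7, w_stale: float = 0.3) -> float:
    """Weighted score: frequently failing tests and stale tests rise to the top."""
    staleness = min(t.days_since_last_run / 30.0, 1.0)  # cap staleness at 30 days
    return w_fail * t.recent_failure_rate + w_stale * staleness

def prioritize(tests: list[TestStats]) -> list[str]:
    """Return test names ordered from highest to lowest priority."""
    return [t.name for t in sorted(tests, key=priority, reverse=True)]

suite = [
    TestStats("test_checkout", recent_failure_rate=0.5, days_since_last_run=2),
    TestStats("test_login",    recent_failure_rate=0.0, days_since_last_run=45),
    TestStats("test_search",   recent_failure_rate=0.1, days_since_last_run=1),
]
print(prioritize(suite))  # test_checkout first: it fails most often
```

Real AI-driven prioritization layers richer signals on top of this idea, such as code churn near each test's coverage area and learned correlations between changes and failures.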