
Prompt Engineering Best Practices You Should Know

Look around yourself. We are swimming in a world of data and AI. From students at school using ChatGPT to complete their assignments to professionals using AI for market research, content creation, or even debugging code, everyone is leveraging the power of large language models (LLMs). Mr. Smith isn’t Googling his tax questions anymore; he’s asking an AI assistant.

What AI Approach is Right for You: LLM Apps, Agents, or Copilots?

The generative AI hype train doesn’t appear to be slowing down, with organizational adoption rising from 33% in 2023 to 78% by the end of 2024. Larger companies are leading the way in GenAI adoption, and the global AI market is projected to grow 36.6% annually between 2024 and 2030. However, GenAI growth isn’t following a linear path: organizations are adopting different AI approaches depending on their specific use cases.

Automate safer API delivery with Kong Gateway and decK

Moving faster shouldn’t mean giving up control. In this quick demo, see how to go from an OpenAPI spec to a fully configured service in Kong Gateway, with built-in plugins, tagging, backups, and diff checks for safer, repeatable API deployment. Learn how your platform team can use decK to promote configurations from non-production to production with confidence. If you're looking to streamline API operations without sacrificing governance or security, this walkthrough is a must-watch.
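To make the workflow concrete, here is a minimal sketch of a decK state file for a hypothetical service — the service name, upstream URL, route path, tag, and rate limit below are illustrative assumptions, not taken from the demo:

```yaml
_format_version: "3.0"
services:
  - name: billing-api                      # hypothetical service name
    url: https://billing.internal.example  # hypothetical upstream
    tags:
      - team-billing                       # tag used to scope sync/diff
    routes:
      - name: billing-route
        paths:
          - /billing
    plugins:
      - name: rate-limiting                # built-in Kong plugin
        config:
          minute: 60                       # illustrative limit
```

A platform team could generate a starting point like this from an OpenAPI spec with `deck file openapi2kong`, preview the changes against a running gateway with `deck gateway diff`, and then apply them with `deck gateway sync` — the diff step is what makes promotion from non-production to production repeatable and auditable.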

Dev Portal & AI Coding Agents

The AI coding era isn’t coming; it’s already here. Today’s AI agents aren’t just helping developers code faster; they’re starting to drive entire development workflows. In this quick video, see how Kong’s Developer Portal makes it easy for both humans and AI agents to securely discover, consume, and collaborate on APIs across your organization. From accelerating planning to boosting developer productivity, Kong gives you a smarter way to manage APIs at enterprise scale.

Unlocking Seamless Integration with MCP Servers on Choreo

Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to large language models (LLMs). It’s becoming a foundational layer in many AI-native workflows, especially when working with real-time or continuously updating data sources. We're excited to announce that Choreo now supports the deployment of MCP servers, empowering developers to integrate AI capabilities more efficiently into their applications.
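Under the hood, MCP messages are JSON-RPC 2.0 exchanged over a transport such as stdio or HTTP. As a rough sketch (the client name and version here are hypothetical), a client opens a session by sending an `initialize` request like this:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": { "name": "example-client", "version": "0.1.0" }
  }
}
```

The server responds with the capabilities it exposes (tools, resources, prompts), after which the client can discover and invoke them with methods such as `tools/list` and `tools/call` — which is what lets an LLM-backed application pull in real-time context from a deployed MCP server.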

How to Read and Analyze iOS Crash Reports: A Developer's Guide

The crash-proof app doesn’t exist. It never has, and it probably never will, because apps can crash for all kinds of reasons, some of them impossible to foresee. No matter how well we build them, our apps are going to crash. So, as devs, we need to know how to react when a crash happens, and in this context, understanding crash reports is crucial. They provide the clues we need to put the pieces together.