
The PII Sanitization Needed for LLMs and Agentic AI Is Now Easier to Build

The excitement around large language models (LLMs) and agentic AI is justified. These systems can summarize, generate, reason, and even take actions across APIs, all with minimal human input. However, as enterprises race to integrate LLMs into real-world workflows, especially in regulated environments or where sensitive data is involved, one fundamental question looms large: how do you keep personally identifiable information (PII) out of prompts and model outputs?
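To make the idea concrete, here is a minimal sketch of prompt-side PII sanitization. The patterns and placeholder labels are illustrative assumptions, not the method from the article; production systems typically combine regexes with NER-based tools such as Microsoft Presidio.

```python
import re

# Illustrative patterns only; real deployments need broader coverage
# (names, addresses, account numbers) and locale-aware formats.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
}

def sanitize(text: str) -> str:
    """Replace detected PII with typed placeholders before prompting an LLM."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text

prompt = "Email jane.doe@example.com or call 555-123-4567 about SSN 123-45-6789."
print(sanitize(prompt))  # Email [EMAIL] or call [PHONE] about SSN [SSN].
```

Typed placeholders (rather than plain `[REDACTED]`) preserve enough structure for the model to reason about the text while keeping the raw values out of the prompt.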

Consistently Hallucination-Proof Your LLMs with Automated RAG

AI is quickly transforming the way businesses operate, turning what was once futuristic into everyday reality. However, we're still in the early innings, and organizations should remain aware of several key limitations, such as hallucination, to ensure AI is being leveraged in a safe and productive way.
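The core of RAG is retrieving relevant documents and grounding the prompt in them so the model answers from supplied facts rather than invention. This toy sketch uses bag-of-words cosine similarity as a stand-in for the embedding-based retrieval a real pipeline would use; the documents and helper names are illustrative assumptions.

```python
import math
from collections import Counter

# Stand-in knowledge base; a real system would use a vector database.
DOCS = [
    "Acme's refund window is 30 days from the delivery date.",
    "Acme ships to the US, Canada, and the EU.",
    "Support is available weekdays from 9am to 5pm Eastern.",
]

def vectorize(text: str) -> Counter:
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = vectorize(query)
    return sorted(DOCS, key=lambda d: cosine(q, vectorize(d)), reverse=True)[:k]

def build_prompt(query: str) -> str:
    """Ground the LLM prompt in retrieved context to curb hallucination."""
    context = "\n".join(retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

The "answer using only this context" instruction, combined with retrieval, is what lets the model decline to answer when the knowledge base has no relevant facts.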

Katalon's 2025 State of Software Quality Report reveals insights from 1,500 QA professionals worldwide

Despite fears of job loss, QA professionals are leaning into AI faster than ever, according to Katalon’s newly released 2025 State of Software Quality Report. The report reveals that testers using AI tools are twice as likely to fear being replaced by them, a paradox that underscores the profession’s evolving relationship with automation.

Key Takeaways from Our Research: The Rise of Large Language Models - Transforming AI and Beyond

Large language models (LLMs) have redefined artificial intelligence (AI), pushing the boundaries of natural language processing (NLP) and enabling machines to understand, generate, and manipulate human-like text. From chatbots and content creation to legal and medical applications, LLMs are transforming industries at an unprecedented pace. But what makes these models so powerful? How do they work? And what challenges do they pose?

Introducing Kong's New MCP Server to Access Your API System of Record

The Model Context Protocol (MCP) is a new way to integrate LLMs and AI agents with third-party data sources and APIs. It significantly improves how we build tool integrations by eliminating duplicated code and providing a centralized interface through which multiple agents access shared tools. Today, we're excited to announce the release of Kong's MCP Server for the Kong Konnect platform. This empowers customers to integrate AI agents and query LLMs to discover APIs, services, and traffic analytics in real time.
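The pattern MCP standardizes can be sketched as a single server-side tool registry that any agent can discover and invoke, instead of each agent bundling its own integration code. All names below are hypothetical illustrations, not Kong's actual API or the MCP wire protocol.

```python
from typing import Callable

class ToolServer:
    """Hypothetical centralized registry of tools shared by multiple agents."""

    def __init__(self) -> None:
        self._tools: dict[str, Callable[..., object]] = {}

    def tool(self, name: str):
        """Decorator that registers a function as a discoverable tool."""
        def register(fn):
            self._tools[name] = fn
            return fn
        return register

    def list_tools(self) -> list[str]:
        return sorted(self._tools)          # agents discover capabilities

    def call(self, name: str, **kwargs):
        return self._tools[name](**kwargs)  # agents invoke tools by name

server = ToolServer()

@server.tool("list_apis")
def list_apis(tag: str = "") -> list[str]:
    catalog = {"orders-api": "prod", "billing-api": "prod"}  # stand-in data
    return [api for api in catalog if tag in api]

# Any agent, regardless of framework, uses the same shared interface:
print(server.list_tools())                     # ['list_apis']
print(server.call("list_apis", tag="orders"))  # ['orders-api']
```

Because the registry lives in one place, fixing or extending a tool immediately benefits every agent that uses it, which is the duplication-elimination benefit described above.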