
Kafka Service Bundle: Managed Apache Kafka Without Lock-In

Apache Kafka delivers unmatched performance for real-time data streaming, but managing it in-house requires deep expertise. That's where OpenLogic's Kafka Service Bundle comes in. This managed Apache Kafka solution helps enterprises simplify operations, control costs, and maintain full ownership of their data, without the vendor lock-in of commercial clouds. With OpenLogic, your business gets the freedom and flexibility of open source Kafka, supported by the expertise and reliability enterprises depend on.

Demo: Streaming Agents for price matching, with RAG, observability, and Real-Time Context Engine

Streaming Agents enable you to build, deploy, and orchestrate event-driven agents on Apache Flink and Apache Kafka. Embedded in the stream, they can tap into the latest enriched data and be the eyes and ears of a business, continuously monitoring and acting on live operational events. In this demo, Brenner Heintz, Staff Technical Marketing Manager at Confluent, shows how to build price matching agents, do vector search for retrieval augmented generation (RAG), and leverage Confluent’s Real-Time Context Engine to process and serve fresh context the moment it’s needed for AI decision-making.
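The retrieval step the demo relies on can be illustrated with a minimal sketch. This is not Confluent's actual Streaming Agents or Real-Time Context Engine API; it only shows the core idea behind vector search for RAG, namely ranking stored documents by embedding similarity, with toy 3-dimensional vectors standing in for a real embedding model. In a streaming deployment, the document embeddings would be kept fresh from a Kafka topic rather than hard-coded.

```python
# Hypothetical sketch of vector search for RAG; toy embeddings only.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def retrieve_context(query_embedding, documents, top_k=2):
    """Return the top_k document texts most similar to the query.

    `documents` is a list of (text, embedding) pairs. In a streaming
    setup these pairs would be continuously updated from live events.
    """
    ranked = sorted(
        documents,
        key=lambda doc: cosine_similarity(query_embedding, doc[1]),
        reverse=True,
    )
    return [text for text, _ in ranked[:top_k]]

# Toy corpus: price-related events score closer to a price-related query.
docs = [
    ("competitor A lowered price to $19", [0.9, 0.1, 0.0]),
    ("warehouse stock report",            [0.0, 0.2, 0.9]),
    ("competitor B price now $21",        [0.8, 0.2, 0.1]),
]
context = retrieve_context([1.0, 0.0, 0.0], docs, top_k=2)
```

The two price-change events rank above the unrelated stock report, which is exactly the property a price-matching agent needs from its retrieval layer.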

Keynote: Building Intelligent Systems on Real-time Data

Join Jay Kreps, Confluent leadership, our customers, and industry thought leaders to learn how you can build intelligent systems with real-time data. We'll show you why streaming is becoming ubiquitous across the business, and how that unlocks a shift-left approach: process and govern at the source, then reuse everywhere. Expect live demos and candid customer stories that make it concrete. Whether you're a data leader, architect, or builder, you'll leave with practical playbooks for bringing real-time AI to production. The future is here; let's ignite it together!

Why Apache Kafka Migration Costs Are Often Underestimated

Because Apache Kafka is a critical, stateful system, migrating a deployment is virtually always a complex engineering project in which the most significant expenses are often hidden. Scoping and committing to a Kafka migration requires multiple layers of careful calculation involving infrastructure choices, data complexity, team expertise, and risk tolerance. Underestimating these variables leads to blown budgets and extended timelines.
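The layered nature of that calculation can be sketched as a back-of-envelope model. Every category and number below is a hypothetical assumption for illustration, not a published cost model; the point is simply that the total is a sum of layers, several of which (egress fees, dual-running two clusters, a risk buffer) are easy to forget when scoping.

```python
# Hedged back-of-envelope sketch: all figures are illustrative assumptions.
def estimate_migration_cost(
    data_tb,                # topic data to move, in terabytes
    egress_cost_per_tb,     # cloud data-transfer pricing (varies widely)
    engineer_weeks,
    weekly_eng_cost,
    dual_run_weeks,         # weeks old and new clusters run in parallel
    weekly_infra_cost,
    risk_buffer=0.25,       # contingency for surprises; assumed 25%
):
    """Sum the visible cost layers, then apply a contingency buffer."""
    base = (
        data_tb * egress_cost_per_tb
        + engineer_weeks * weekly_eng_cost
        + dual_run_weeks * weekly_infra_cost
    )
    return base * (1 + risk_buffer)

# Example scenario: 50 TB of topic data, 8 engineer-weeks of work,
# and 4 weeks of running both clusters side by side.
total = estimate_migration_cost(
    data_tb=50, egress_cost_per_tb=90,
    engineer_weeks=8, weekly_eng_cost=5_000,
    dual_run_weeks=4, weekly_infra_cost=2_000,
)
```

Even in this simplified scenario, the contingency buffer and parallel-run infrastructure add a five-figure sum that a naive "engineering time only" estimate would miss.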

Multi-Kafka topic & data governance with your AI assistant

If you’ve been running Kafka for a while, with any luck you have quite a few engineering teams onboarded, potentially with hundreds or even thousands of applications. Hopefully the Lenses.io Developer Experience platform helped in this adoption. But finding the right balance between governance and openness can be tricky.

Lenses 6.1 - Kafka connectivity to your Copilot & self-service data replication

Here at Lenses we're, as always, laser-focused on making building streaming apps and managing Kafka not just less stressful, but delightful. Lenses 6.1 is another big step forward. It starts with the Lenses MCP Server, which connects AI assistants such as Cursor and Claude to your environment, combining the knowledge of the internet with the context of your Kafka deployment. It has the power to transform the work of engineers building and managing streaming apps.

The True Cost of Real-Time Data Streaming

Thanks to the ever-increasing adoption of technologies like Apache Kafka and Apache Flink, the continuous movement and streaming of real-time data has transformed how modern businesses operate… but is the cost of data streaming worth it? From powering personalized recommendations to enabling instant fraud detection, streaming is often seen as synonymous with innovation and competitive advantage. But like any investment, the cost-benefit equation has to make sense.

How to Build Real-Time Compliance & Audit Logging With Apache Kafka

Traditionally, compliance teams have relied on batch exports for their audit logs, a method that, while functional, is proving woefully inadequate in today's fast-paced digital landscape. Waiting hours, or even days, for batch exports of your audit data leaves your organization vulnerable.
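A minimal sketch of the alternative: emitting a structured, append-only audit event at the moment an action occurs, rather than reconstructing it later from batch exports. The field names and the topic name below are illustrative assumptions, not a standard schema, and the actual produce call to Kafka (via a client library such as confluent-kafka) is shown only as a comment so the sketch stays self-contained.

```python
# Illustrative audit-event builder; field names and topic are assumptions.
import json
import uuid
from datetime import datetime, timezone

def build_audit_event(actor, action, resource):
    """Build a JSON-serializable audit record with a timestamp and unique id."""
    return {
        "event_id": str(uuid.uuid4()),          # idempotency / dedup key
        "occurred_at": datetime.now(timezone.utc).isoformat(),
        "actor": actor,                          # who did it
        "action": action,                        # what they did
        "resource": resource,                    # what it was done to
    }

event = build_audit_event("alice", "UPDATE", "customer/42")
payload = json.dumps(event).encode("utf-8")
# In production, this payload would be published immediately, e.g.:
#   producer.produce("audit-events", key=event["actor"].encode(), value=payload)
```

Keying by actor (or by resource, depending on query patterns) keeps related events in order within a partition, which matters when auditors need to replay a sequence of actions.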