
Demo: Build An AI Meeting Coach With Flink SQL, OpenAI, & Confluent

Learn how to build a real-time meeting coach system that uses Confluent's data streaming platform, Apache Flink SQL, and Azure OpenAI to provide sales teams with timely, contextual advice. In this demo, Brenner Heintz, Staff Technical Marketing Manager at Confluent, demonstrates the system's architecture, key components, and how it leverages company knowledge documents to enhance sales conversations.

Confluent Champion: How Vineet Pursues Engineering Excellence in an Innovation Culture

Based in Delhi, India, Vineet Singh has worked as a Senior Software Engineer at Confluent for the past three years, and he has contributed to various parts of Confluent’s core data streaming engine. Now he’s part of the team that makes Apache Kafka cloud-native, serverless, and able to power robust and scalable solutions for Confluent Cloud customers. Learn more about Vineet’s experience and growth at Confluent and how his team and environment have set him up for success.

Using Webhooks to Integrate Confluent Cloud and Microsoft Teams

Data streaming equips modern organizations to rapidly ingest and understand new information and use it to solve real-world problems at scale. For some of these real-time insights—critical operational cues that demand a timely response—delivering that information directly to your team’s inbox is the best way to act on it.
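One common way to push a streaming alert into a Teams channel is an incoming webhook that accepts a JSON "MessageCard" payload via HTTP POST. A minimal sketch of that pattern in Python, using only the standard library — the webhook URL, alert text, and helper names here are illustrative placeholders, not part of any Confluent integration:

```python
import json
import urllib.request

# Hypothetical Teams incoming-webhook URL -- replace with your channel's own.
TEAMS_WEBHOOK_URL = "https://example.webhook.office.com/webhookb2/your-id"

def build_alert_card(title: str, text: str) -> dict:
    """Build a simple MessageCard payload accepted by Teams incoming webhooks."""
    return {
        "@type": "MessageCard",
        "@context": "http://schema.org/extensions",
        "summary": title,
        "title": title,
        "text": text,
    }

def post_to_teams(card: dict, url: str = TEAMS_WEBHOOK_URL) -> None:
    """POST the card as JSON to the webhook endpoint."""
    req = urllib.request.Request(
        url,
        data=json.dumps(card).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)

# Example: turn a consumer-side observation into a channel notification.
card = build_alert_card(
    "Order backlog alert",
    "Orders topic lag exceeded 10,000 messages in the last 5 minutes.",
)
print(card["title"])
```

In a real pipeline, the card would be built from a record consumed off a Kafka topic; the POST call is the only Teams-specific step.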

How to Protect PII in Apache Kafka With Schema Registry and Data Contracts

A data contract is a formal agreement between an upstream component and a downstream component on the structure and semantics of data that’s in motion. In a previous post, I showed how Confluent Schema Registry supports data contracts. By combining data contracts and encryption on streaming workloads, you can shift left the responsibility of data consistency, quality, and security to the producer, allowing the consumer to depend on a trustworthy stream of data.
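The "shift left" idea can be sketched as a producer-side check: fields carrying personal data are tagged in the schema, and the producer refuses to emit a record whose tagged fields are still plaintext. This toy example is illustrative only — it is not the Confluent Schema Registry API, and a real setup would use Schema Registry's client-side field-level encryption rules rather than a string-prefix convention:

```python
# Toy schema: fields holding personal data carry a "PII" tag.
CUSTOMER_SCHEMA = {
    "type": "record",
    "name": "Customer",
    "fields": [
        {"name": "id", "type": "string"},
        {"name": "email", "type": "string", "tags": ["PII"]},
        {"name": "ssn", "type": "string", "tags": ["PII"]},
    ],
}

def pii_fields(schema: dict) -> set:
    """Return the names of fields tagged as PII in the schema."""
    return {f["name"] for f in schema["fields"] if "PII" in f.get("tags", [])}

def check_encrypted(record: dict, schema: dict) -> list:
    """Return tagged fields that do not look encrypted, in schema order.
    (Toy check: assumes encrypted values carry an 'enc:' prefix.)"""
    tagged = pii_fields(schema)
    return [
        f["name"]
        for f in schema["fields"]
        if f["name"] in tagged
        and not str(record.get(f["name"], "")).startswith("enc:")
    ]

record = {"id": "c-42", "email": "enc:Zm9vQGJhcg==", "ssn": "123-45-6789"}
print(check_encrypted(record, CUSTOMER_SCHEMA))  # the 'ssn' field is still plaintext
```

Running the check before produce means consumers can trust that every PII-tagged field in the stream is already protected, instead of each consumer re-validating on its own.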

Demo: Streaming Agents Automate Competitive Pricing in Real Time

Streaming Agents enable you to build, deploy, and orchestrate event-driven agents natively on Apache Flink and Apache Kafka. By unifying stream processing and agentic AI workflows, they leverage fresh context to continuously monitor and act on what’s happening in the business. In this demo, Brenner Heintz, Staff Technical Marketing Manager at Confluent, shows how to build agents that automate real-time competitive price matching on sales orders.

New in Confluent Cloud: Unleashing Cost-Effective Streaming for Any Workload

Streaming at scale just got a lot more powerful and cost-effective. Our Q3 Confluent Cloud launch is packed with innovations to help you do more with less: reduce cloud networking costs while maintaining your security posture, scale effortlessly with boosted connection limits, and build production-ready agentic artificial intelligence (AI) applications with seamless tool integrations.

Unleash Real-Time Agentic AI: Introducing Streaming Agents on Confluent Cloud

As AI models become commoditized, the conversation is shifting from building smarter models to building data infrastructure that turns models into real business value. Enterprises are accelerating their adoption of agentic AI—systems that don’t just predict but plan, decide, and act autonomously—across their software and operations.

Real-Time AI Agents Powered by Apache Kafka, Apache Flink, and Google Cloud

Discover how developers and data teams can build agentic AI applications with the combined power of Google Cloud AI services and Confluent Cloud’s real-time data streaming platform. This video showcases the joint value of integrating Apache Kafka, Confluent Schema Registry, Kafka Connect, and Apache Flink to enable seamless, real-time communication between AI agents. What you’ll learn: With Confluent and Google Cloud, you can go beyond AI experiments to build scalable, enterprise-ready multi-agent systems powered by real-time data.

How Confluent Is Enhancing and Easing Migration to Fully Managed Connectors

Taking advantage of Confluent’s pre-built connectors means you can build integration pipelines without having to write, test, and maintain integration code. And with our fully managed connectors on Confluent Cloud, you can connect to any source or sink system cost-efficiently, with zero operations or infrastructure management. This year, we've been hard at work making the benefits of fully managed connectors a reality for more of our customers.

How Confluent Helps Software Providers Build Real-Time Products and SaaS Faster

Bringing real-time capabilities to your product or software-as-a-service (SaaS) is no longer a nice-to-have; it’s a competitive necessity. Whether you're building a real-time payment platform, a patient monitoring system, or any product where instant data processing fuels great user experiences and artificial intelligence (AI)-driven innovation, Apache Kafka data streaming is likely at the core.

Announcing the Confluent Cloud Fully Managed Sink Connector for ClickHouse

Data is in motion, and it’s moving faster than ever. For developers and data architects building modern real-time data platforms, the ability to get data from anywhere and analyze it instantly is a superpower. That’s why we’re excited to announce a major step forward in this journey: the fully managed sink connector for ClickHouse, now generally available on Confluent Cloud.

From Oracle to MongoDB: How to Modernize Your Tech Stack for Real-Time AI Decisioning

Playlists for every mood and occasion. Media recommendations grouped by the most niche theme from your watch history. Sophisticated ad algorithms that optimize pay-per-click ads for the customer experience. Whether you call them digital natives, disruptors, or just tech giants, the likes of Spotify, Netflix, and Amazon have long made uncannily personal experiences a key part of their differentiation and business models.

Introducing Private Network Interface: Secure Private Networking on AWS for 50% Less

This is the second post in our series exploring the architectural innovations that make Confluent Cloud more cost-effective at scale. Building on our previous post about the operational complexities of Apache Kafka and our cloud-native architecture's solutions, we'll now dive into how we solved a core challenge for any data streaming workload: high cloud networking costs.