
Streaming Data Integration with Apache Kafka

Event streaming supports a wide range of applications and use cases. Event-driven microservices built on data streaming let companies apply domain-driven design, breaking applications into composable microservices that teams can develop independently, which speeds development. These designs scale well and process huge volumes of data efficiently.

Cloud API Keys vs Resource-Specific API Keys in Confluent Cloud

As you build and manage data streams in Confluent Cloud, securing your interactions with its APIs is paramount. Confluent Cloud offers two types of API keys for authenticating to its APIs: cloud API keys and resource-specific API keys. Each has its own distinct characteristics and use cases.
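A resource-specific API key is typically supplied to Kafka clients as SASL/PLAIN credentials. A minimal client configuration sketch, assuming a key scoped to a Kafka cluster; the bootstrap endpoint, key, and secret are placeholders:

```properties
# Confluent Cloud Kafka client configuration (all values are placeholders)
bootstrap.servers=pkc-XXXXX.us-east-1.aws.confluent.cloud:9092
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
# The resource-specific API key acts as the username, its secret as the password
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username='<KAFKA_API_KEY>' \
  password='<KAFKA_API_SECRET>';
```

A cloud API key, by contrast, is used against organization-level management APIs (for example, as HTTP basic-auth credentials) rather than in a client configuration like this one.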

Empowering Customers: The Role of Confluent's Trust Center

The foundation of every successful customer relationship is trust. At Confluent, we understand that for our customers and prospects to innovate with confidence, they must have complete trust in the security and integrity of our platform. Our commitment goes beyond simply providing a secure product. It’s about empowering our customers with the tools and transparency they need to feel confident in their data streaming architectures.

2026 Predictions: What's Next for Data Streaming and AI | Life Is But A Stream

AI isn’t just evolving—it’s reshaping who your customers are, how systems operate, and what real time really means. From machines making purchase decisions to agents increasing query volume across databases, the realities of 2026 are forcing leaders to rethink data architecture and governance strategies at a fundamental level. In this episode, Joseph is joined by Will LaForest (Field CTO, Confluent), Adi Polak (Director of Developer Advocacy & Experience, Confluent), and independent analyst, Sanjeev Mohan, to break down critical insights from Confluent’s 2026 Predictions Report.

Starting With Purpose: In-Person Onboarding in a Remote-First World

The hardest part about remote work is building real connection and purpose when people aren't in the same room. At Confluent, we know flexibility is essential, but we also know that great work and a sense of belonging don’t just happen; they take effort. That’s why we’re intentional about how we bring people together, starting from day one.

Thunai Automates Customer Support with AI Agents and Data Streaming

Support teams live in a world of repetitive questions, fragmented tools, and growing customer expectations. Customer service agents bounce between customer relationship management (CRM) systems, ticketing, email, and chat while customers wait, often repeating the same information across channels. Batch-based systems don’t scale for AI: context is always a step behind, escalations pile up, and it’s difficult to intervene in time.

Confluent Cloud Is Your Life (K)Raft Away From Hosted Apache Kafka

Streaming your data with Apache Kafka, at its core, involves moving data from one point to another in real time, much like a river flows from its source to its destination. However, beneath this seemingly straightforward goal lies significant complexity and hidden costs. The multitude of available deployment options, hosted and managed Kafka services, and design choices make it difficult to navigate the data streaming landscape.

How to Build a Custom Kafka Connector - A Comprehensive Guide

In today’s data-driven world, seamless data integration is crucial to ensuring the smooth operation of modern systems. With the growing complexity of distributed data platforms, businesses and developers are seeking efficient ways to move, process, and transform data. Apache Kafka has become the de facto standard for real-time data streaming, and Kafka Connect plays a key role in facilitating the integration of Kafka with various data sources and sinks.
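Once a custom connector is packaged and installed on a Connect worker, an instance of it is created by submitting a JSON configuration to the Kafka Connect REST API (a POST to the worker's `/connectors` endpoint). A sketch of such a configuration, where the connector class `com.example.FileSourceConnector`, the topic name, and the file path are illustrative placeholders:

```json
{
  "name": "my-file-source",
  "config": {
    "connector.class": "com.example.FileSourceConnector",
    "tasks.max": "1",
    "topic": "input-lines",
    "file": "/tmp/input.txt"
  }
}
```

The `connector.class` names the custom implementation on the worker's plugin path, and `tasks.max` caps how many tasks Connect may spawn for parallelism; the remaining keys are whatever configuration the custom connector itself defines.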