
Confluent Connect: FY'25 Launch Highlights - Unlocking Data & Powering AI Pipelines

Dive into the biggest breakthroughs for the Confluent Connect ecosystem in 2025! This year, we made moving data easier than ever, from modernizing legacy systems with the Oracle XStream CDC Premium Connector to empowering developers with Custom SMTs and Custom Connectors on Google Cloud. Discover the more than 10 new connectors we launched, including Snowflake Source, Azure Cosmos DB v2, and Neo4j Sink, plus the release of Confluent Hub 2.0. Learn how Confluent Cloud connectors are breaking down silos and building bridges for your next-gen AI and data modernization projects.

Why Managing Your Apache Kafka Schemas Is Costing You More Than You Think

For developers building event-driven systems, schemas are essential: they define the data contracts between producers and consumers in Apache Kafka, ensuring every message can be correctly interpreted. But when schema management is handled manually or through do-it-yourself (DIY) solutions, organizations face escalating expenses that compound as their deployments scale.
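To make the "data contract" idea concrete, here is a deliberately simplified sketch of the kind of backward-compatibility check a schema registry performs before accepting a new schema version. The dict-based schema shape and field names are illustrative assumptions, not Schema Registry's actual data model:

```python
# Simplified illustration (not the Schema Registry implementation): a
# backward-compatibility check for record schemas. A new schema is
# backward compatible if consumers using it can still read data written
# with the old schema, so every field it adds must carry a default.

def is_backward_compatible(old_schema: dict, new_schema: dict) -> bool:
    """Schemas map field name -> {"type": ..., "default": ...(optional)}."""
    for name, spec in new_schema.items():
        if name not in old_schema:
            # Field absent from old data: readable only via a default value.
            if "default" not in spec:
                return False
        elif old_schema[name]["type"] != spec["type"]:
            # Type changes would need promotion rules; treat as incompatible here.
            return False
    return True

v1 = {"id": {"type": "long"}, "email": {"type": "string"}}
v2_ok = {**v1, "plan": {"type": "string", "default": "free"}}
v2_bad = {**v1, "plan": {"type": "string"}}  # no default: breaks old data

print(is_backward_compatible(v1, v2_ok))   # True
print(is_backward_compatible(v1, v2_bad))  # False
```

Doing this check once, centrally, at registration time is exactly the work that otherwise gets re-implemented (or skipped) in every DIY pipeline.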

Confluent Recognized in 2025 Gartner Magic Quadrant for Data Integration Tools

We are pleased to announce that Confluent has been recognized again as a Challenger in the 2025 Gartner Magic Quadrant for Data Integration Tools. We believe this recognition validates the scale and reliability of our platform, acknowledging our "Ability to Execute" in powering the mission-critical data flows of the world's largest organizations.

Why Cluster Rebalancing Counts More Than You Think in Your Apache Kafka Costs

Cluster rebalancing is the redistribution of partitions across Kafka brokers to balance workload and performance. While this task is a necessary and frequent part of routine Apache Kafka operations, its true impact on infrastructure stability, resource consumption, and cloud expenditures is often underestimated.
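A toy model shows why rebalancing is more expensive than it looks: every reassigned partition means replica data copied across the network. The round-robin assignment below is an illustrative stand-in, not Kafka's actual assignment algorithm, and the topic and broker names are made up:

```python
# Toy model of partition rebalancing: spread partitions round-robin over
# brokers, then count how many partitions change brokers when a new
# broker joins. Each moved partition implies copying its replica data.

def assign(partitions: list[str], brokers: list[int]) -> dict[str, int]:
    return {p: brokers[i % len(brokers)] for i, p in enumerate(partitions)}

partitions = [f"orders-{i}" for i in range(12)]

before = assign(partitions, brokers=[1, 2, 3])
after = assign(partitions, brokers=[1, 2, 3, 4])  # broker 4 joins

moved = [p for p in partitions if before[p] != after[p]]
print(f"{len(moved)} of {len(partitions)} partitions must move")
```

In this naive scheme, adding one broker moves 9 of 12 partitions; real-world tooling tries to minimize such movement precisely because the network and I/O cost of each move is what quietly inflates the bill.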

Apache Kafka Monitoring Is Costing You More Than You Think

For organizations that rely on Apache Kafka, monitoring isn’t just a "nice-to-have": it’s a fundamental requirement for reliable production performance and business continuity. However, the true cost of monitoring Kafka is often misunderstood. It’s not a single line item on a bill but a collection of hidden expenses that silently drain your engineering budget and inflate your total cost of ownership (TCO).
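One of the most basic signals any Kafka monitoring setup must compute is consumer lag: the gap between a partition's latest (log-end) offset and the consumer group's committed offset. A minimal sketch follows; the partition names and offset numbers are invented for illustration, and in production these values come from the brokers rather than hard-coded dicts:

```python
# Minimal sketch of one core Kafka monitoring signal: per-partition
# consumer lag. Offsets here are made-up sample values; a real monitor
# would fetch them from the cluster on a schedule.

log_end_offsets = {"orders-0": 5_200, "orders-1": 4_800, "orders-2": 5_050}
committed_offsets = {"orders-0": 5_180, "orders-1": 4_800, "orders-2": 3_900}

lag = {p: log_end_offsets[p] - committed_offsets[p] for p in log_end_offsets}
total_lag = sum(lag.values())

# A simple alerting rule: flag partitions whose lag exceeds a threshold.
stuck = [p for p, n in lag.items() if n > 1_000]
print(f"total lag: {total_lag}, stuck partitions: {stuck}")
```

Even this trivial check hints at the hidden costs the post describes: someone has to build the collection, the thresholds, the alert routing, and keep all of it running.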

Cost to Build a Data Streaming Platform: TCO, Risks, and Alternatives

For many organizations, the decision to adopt a data streaming architecture is a strategic imperative—critical for driving everything from instant personalization to global fraud detection. The question is no longer if they should stream, but how. This leads directly to a critical, often underestimated, financial calculation: the cost to build a data streaming platform (DSP) in-house versus the cost of subscribing to a managed service. Let’s explore key considerations in the "build vs. buy" decision.

2026 Data & AI Predictions: What Trends Will Shape the Future?

We recently released our 2026 Confluent Predictions Report, outlining bold ideas and trends that are shaping the future of data, AI, and real-time systems. And stay tuned for an upcoming episode of the Life Is But a Stream web show that will air early in the new year. Join the conversation as host Joseph Morais sits down with Sanjeev Mohan, independent analyst at SanjMo, for an exciting roundtable discussion breaking down those predictions. Are they forecasts? Are they trends? And which ones will matter most as we move forward into 2026?

Inside Life Is But A Stream: A Year in Review (Real-Time Data and AI) | Confluent Podcast

Technology leaders share how they process streaming events end-to-end with sub-second latency, protect sensitive data through immutable and governed pipelines, run secure real-time systems in regulated environments, and enable multi-agent AI systems built on fresh, continuously streaming data.

Best of 2025: Everything You Missed in 15 Minutes | Life Is But A Stream

In 2025, real-time data moved from important to mission-critical. From billion-event pipelines to AI agents reasoning over fresh data streams, the conversations this year revealed just how fast organizations are rethinking their data infrastructure and what’s possible when data streaming becomes the foundation.

IBM to Acquire Confluent

We are excited to announce that Confluent has entered into a definitive agreement to be acquired by IBM. After the transaction is closed (subject to customary closing conditions and regulatory approvals), together, IBM and Confluent will aim to provide a platform that unifies the world’s largest enterprises, unlocking data for cloud/microservices, accelerating time-to-value, and building the real-time data foundation required to scale AI across every organization.

Simplify Real-Time Context Engineering for Snowflake Intelligence With Confluent

Snowflake Intelligence brings enterprise insights to every employee’s fingertips, helping users answer complex questions in natural language with their own personalized enterprise intelligence agent. But for these agents to deliver truly accurate, contextually aware results within Snowflake’s Cortex framework, they need more than access to static or batch data—they need a continuous, trustworthy view of everything happening across the business right now.

Empowering the Data Streaming Ecosystem: Evolving Confluent Hub to Confluent Marketplace

Today marks a monumental step in our commitment to fueling the growth, reach, and impact of our global partner network. We’re thrilled to announce the official launch of Confluent Marketplace (formerly Confluent Hub), a centralized resource designed to accelerate innovation, drive connectivity, and dramatically simplify the developer experience within the data streaming landscape. For years, integration engineers have been the quiet force behind the modern digital world.

Connecting the Dots: Simplifying Multi-API Data Flows into Apache Kafka

In today’s data-driven software-as-a-service (SaaS) environments, the need for complete customer insights often requires fetching and sharing data that lives across multiple API endpoints. That’s why many of our customers want to use Confluent’s data streaming and integration capabilities to implement real-time API chaining—a technique that allows them to automatically follow relationships between APIs.
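The chaining pattern itself is simple to sketch: fetch a parent resource, follow a link embedded in each record to a child endpoint, and emit one flattened event per child. In the hedged example below, the two "endpoints" are in-memory stubs standing in for real HTTP calls (and the emitted list stands in for a Kafka produce step); all names and fields are hypothetical:

```python
# Illustrative sketch of API chaining: the customer endpoint returns
# records containing an "orders_url" link, which we follow to a second
# endpoint and flatten into per-order events. Both endpoints are
# in-memory stubs standing in for real HTTP GETs.

CUSTOMERS = [{"id": 1, "name": "Acme", "orders_url": "/customers/1/orders"}]
ORDERS = {"/customers/1/orders": [{"order_id": 101, "total": 40.0},
                                  {"order_id": 102, "total": 60.0}]}

def get(path):
    """Stand-in for an HTTP GET against the SaaS API."""
    if path == "/customers":
        return CUSTOMERS
    return ORDERS[path]

def chain_apis():
    events = []
    for customer in get("/customers"):
        for order in get(customer["orders_url"]):  # follow the relationship
            events.append({"customer": customer["name"], **order})
    return events

events = chain_apis()
print(events)
```

The point of doing this inside a streaming platform rather than in ad hoc scripts is that the chained fetches become a continuously refreshed event stream instead of a one-off join.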