Today, Confluent announced a $200 million commitment over the next three years designed to supercharge the growth and impact of its global partner ecosystem. This investment expands on Confluent’s partner-centric strategy; well over 20%* of the data streaming pioneer’s business in the past year has been partner-sourced, underscoring the ecosystem’s vital role in unlocking real-time use cases at scale.
Operational and analytical estates have been separated since data warehouses were first introduced in the 1990s. The operational estate includes the microservices, software-as-a-service (SaaS) apps, and enterprise resource planning (ERP) systems that have become the beating heart of an organization. The analytical estate consists of the data warehouses, lakehouses, artificial intelligence (AI)/machine learning (ML) platforms, and other custom batch workloads that support business analysis and reporting.
Businesses need to be able to respond quickly to macroeconomic uncertainty and changing market conditions, whether those are caused by trade disputes, public health crises like the COVID-19 pandemic, or simply abrupt changes in consumer demand. Amid that swirl of disruptions, technology leaders are also being pressured by management and investors to optimize their businesses and drive as much efficiency as possible.
In a highly regulated industry where every millisecond matters, Swedbank is shifting its data strategy to move fast without breaking trust. In this episode, Rami Al Lolah, Lead Architect at Swedbank’s Integration Center of Excellence, shares how the bank built a future-proof foundation using Apache Kafka with Confluent Cloud.
For Taohao, being a Senior Software Engineer means never standing still. Whether it’s picking up a new skill, diving deeper into a complex system, or exchanging ideas with teammates, he thrives in an environment where curiosity, innovation, and collaboration come together. Let’s find out what makes Confluent the place where engineers like Taohao aren’t just solving problems but are constantly learning, challenging each other, and building what’s next.
According to the 2025 Data Streaming Report, 44% of IT leaders reap 5x or more ROI with data streaming. From powering AI initiatives to improving customer experience, find out how companies like @VictoriasSecret are driving myriad business benefits with easy access to real-time data.
A joint post from the teams at NeuBird and Confluent.
For organizations leveraging Confluent, ensuring smooth operations is mission-critical. While Confluent Cloud eliminates the operational burden of managing Apache Kafka, application teams still need to monitor and troubleshoot client applications connecting to Kafka clusters.
Confluent announces the availability of Confluent Cloud in the new AI Agents and Tools category of AWS Marketplace. This enables AWS customers to easily discover, buy, and deploy AI agent solutions, including Confluent's fully managed data streaming platform, Confluent Cloud, through their AWS accounts, accelerating the development of AI agents and agentic workflows.
In my last blog post, I looked at the essential first part of building any data pipeline—exploring the raw source data to understand its characteristics and relationships. The data covers river levels, rainfall, and other weather measurements published by the UK Environment Agency via a REST API. I used the HTTP Source connector to stream it into Apache Kafka topics (one per REST endpoint), and then Tableflow to expose them as Apache Iceberg tables.
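For a rough sense of what that connector automates, here is a minimal Python sketch that polls one Environment Agency endpoint and produces each record to a Kafka topic. The topic name, broker address, and polling interval are illustrative assumptions, not the post's actual connector configuration.

```python
# A minimal sketch of the pattern the HTTP Source connector automates:
# poll a REST endpoint and produce each record to a Kafka topic.
# Assumes a local broker and the flood-monitoring API's {"items": [...]} shape.
import json
import time

import requests
from confluent_kafka import Producer

ENDPOINT = "https://environment.data.gov.uk/flood-monitoring/id/stations"
TOPIC = "flood-monitoring-stations"  # hypothetical name; one topic per endpoint

producer = Producer({"bootstrap.servers": "localhost:9092"})

while True:
    items = requests.get(ENDPOINT, timeout=30).json().get("items", [])
    for item in items:
        # Key by the station's URI so updates for one station stay in order.
        producer.produce(TOPIC, key=item.get("@id"), value=json.dumps(item))
    producer.flush()
    time.sleep(300)  # the managed connector polls on a configured interval
```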
What’s your AI strategy? How are you making your service as easy to use as those from major cloud companies? How quickly can you roll out new features to customers? Technology executives face questions like these daily. They must accelerate innovation across the board, from deploying new features and making decisions to generating revenue and saving money.
Many teams begin their data streaming journey through their cloud provider, drawn to the simplicity of the one‑click “Create Kafka Cluster” button in their cloud console. It’s fast, feels integrated, and promises to “just work”—abstracting away all the operational tasks that only get more complicated in the cloud.
In today's competitive landscape, data warehouses and data lakes are the essential platforms for business intelligence, analytics, and AI. While immensely powerful, these systems were traditionally designed for batch data processing, often leading to insights based on data that is hours or even days old. The primary challenge has always been the complexity of bridging the gap between real-time data streams, typically flowing through Kafka, and these analytical systems.
Get a peek at the key findings from the 2025 Data Streaming Report and what 4K+ IT leaders had to say about how they’re driving business value (think faster time to market and AI innovation) with data streaming platforms.
Organizations have mastered collecting and storing vast amounts of data in cloud data warehouses like Snowflake. This central repository has become the single source of truth for analytical insights, business intelligence, and reporting. However, the true potential of this data remains trapped if it's confined to the warehouse, creating a disconnect between rich analytical insights and real-time operational systems.
Is your data strategy keeping up with your AI ambitions? In this special episode, we unpack findings from Confluent’s 2025 Data Streaming Report—a global pulse check on how IT leaders (4,000+ to be specific) are using data streaming to drive innovation, accelerate time to market, and reduce costs and complexity.
This demo showcases a use case for a mortgage provider that leverages Confluent Cloud, Databricks, and AWS to fully automate mortgage applications—from initial submission to final decision and offer. New to Confluent? Experience unified Apache Kafka and Apache Flink with a free trial.
The evolution of artificial intelligence (AI) in the enterprise has reached an inflection point. While the early days of generative AI focused on chatbots responding to human prompts, today's enterprise AI agents are fundamentally different—they're event-driven, autonomous systems that continuously process streams of business data, make real-time decisions, and take actions at scale.
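As a loose illustration of that event-driven loop, here is a hedged Python sketch built on a Kafka consumer. The topic, group id, and threshold rule are hypothetical stand-ins for whatever decision logic a real agent would run (often an LLM or policy engine), not anything described in the post itself.

```python
# A minimal sketch of an event-driven agent: consume business events
# continuously and act per event, with no human prompt in the loop.
import json

from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "order-review-agent",  # hypothetical agent identity
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])  # hypothetical stream of business events

while True:
    msg = consumer.poll(1.0)  # block up to 1s waiting for the next event
    if msg is None or msg.error():
        continue
    order = json.loads(msg.value())
    # Placeholder decision: a real agent would reason over the event here.
    if order.get("amount", 0) > 10_000:
        print(f"flagging order {order.get('id')} for review")
```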
We understand new concepts by linking them to something familiar. These analogies aren’t just helpful; they’re how we think. For me, that something familiar is chess, and I’ll use it to explain some of the core ideas behind stream processing—a concept that requires a shift from seeing tables as static snapshots to treating them as materialized projections of a continuous stream of changes.
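To make that shift concrete, here is a small, framework-agnostic Python sketch that folds a chess-flavored changelog into the current "table" (the board as it stands now). The (key, op, value) event shape is an assumption for illustration, not any specific framework's changelog format.

```python
# A table as a materialized projection of a change stream: fold a changelog
# of upserts and deletes into the current state. Keys are board squares.
changelog = [
    ("e4", "upsert", "white pawn"),  # 1. e4
    ("d5", "upsert", "black pawn"),  # 1... d5
    ("e4", "delete", None),          # 2. exd5: the pawn leaves e4...
    ("d5", "upsert", "white pawn"),  # ...and replaces the pawn it captured
]

def materialize(events):
    """Replay the stream from the start to rebuild the table at 'now'."""
    table = {}
    for key, op, value in events:
        if op == "delete":
            table.pop(key, None)
        else:
            table[key] = value
    return table

print(materialize(changelog))  # {'d5': 'white pawn'}: the current snapshot
```

Replaying only a prefix of the stream yields the table as of any earlier move, which is exactly the snapshot-versus-projection shift described above.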
Confluent has established itself as a leader in event streaming, providing not only a robust platform but also a rich portfolio of pre-built connectors. These connectors act as bridges, effortlessly channeling data between a multitude of systems, from databases and applications to cloud services. This extensive portfolio empowers users to weave together their data landscapes with remarkable ease and flexibility.
Three in four programmers have tried artificial intelligence (AI). That finding comes from a recent Wired survey on the habits of engineers with respect to AI tooling like GitHub Copilot. Though Wired’s pool was only around 700 engineers, Gartner predicted a year ago that 75% of enterprise software engineers would use AI by 2028. To many of us, it’s starting to feel like that’s already happened.