
Confluent Cloud is now available in the new AWS Marketplace AI Agents and Tools category

Confluent announces the availability of Confluent Cloud in the new AI Agents and Tools category of AWS Marketplace. AWS customers can now easily discover, buy, and deploy AI agent solutions, including Confluent Cloud, Confluent's fully managed data streaming platform, using their AWS accounts, accelerating AI agent and agentic workflow development.

Building Streaming Data Pipelines, Part 2: Data Processing and Enrichment With SQL

In my last blog post, I looked at the essential first part of building any data pipeline—exploring the raw source data to understand its characteristics and relationships. The data covers river levels, rainfall, and other weather measurements published by the UK Environment Agency via a REST API. I used the HTTP Source connector to stream this into Apache Kafka topics (one per REST endpoint), and then Tableflow to expose these as Apache Iceberg tables.
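To give a flavor of the enrichment step the post's title promises, here is a minimal, hypothetical sketch of the kind of SQL join involved: combining raw readings with station metadata. The table and column names are illustrative assumptions, not the post's actual schema.

```sql
-- Hypothetical enrichment: join raw river-level readings with station
-- metadata to produce an enriched stream. Table and column names are
-- illustrative only.
SELECT
  r.station_id,
  s.river_name,
  r.reading_value AS level_m,
  r.reading_ts
FROM readings AS r
JOIN stations AS s
  ON r.station_id = s.station_id;
```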

How Confluent Helps You Deliver New Customer Experiences and Act on Insights Faster

What’s your AI strategy? How are you making your service as easy to use as those from major cloud companies? How quickly can you roll out new features to customers? These are questions technology executives face daily. They must accelerate innovation across the board, from deploying new features and making decisions to generating revenue and saving money.

Why Hosted Apache Kafka Leaves You Holding the Bag

Many teams begin their data streaming journey through their cloud provider, drawn to the simplicity of the one‑click “Create Kafka Cluster” button in their cloud console. It’s fast, feels integrated, and promises to “just work”—abstracting away all the operational tasks that only get more complicated in the cloud.

Unlocking Real-Time Analytics on AWS With Tableflow, Apache Iceberg, and the AWS Glue Data Catalog

In today's competitive landscape, data warehouses and data lakes are the essential platforms for business intelligence, analytics, and AI. While immensely powerful, these systems were traditionally designed for batch data processing, often leading to insights based on data that is hours or even days old. The primary challenge has always been the complexity of bridging the gap between real-time data streams, typically flowing through Kafka, and these analytical systems.

Unlock the Power of Your Data Warehouse: Introducing the Snowflake Source Connector for Confluent Cloud

Organizations have mastered collecting and storing vast amounts of data in cloud data warehouses like Snowflake. This central repository has become the single source of truth for analytical insights, business intelligence, and reporting. However, the true potential of this data remains trapped if it's confined to the warehouse, creating a disconnect between rich analytical insights and real-time operational systems.

Event-Driven AI Agents: Why Flink Agents Are the Future of Enterprise AI

The evolution of artificial intelligence (AI) in the enterprise has reached an inflection point. While the early days of generative AI focused on chatbots responding to human prompts, today's enterprise AI agents are fundamentally different—they're event-driven, autonomous systems that continuously process streams of business data, make real-time decisions, and take actions at scale.

From Pawns to Pipelines: Stream Processing Fundamentals Through Chess

We understand new concepts by linking them to familiar ones. These analogies aren’t just helpful; they’re how we think. For me, that something familiar is chess, and I’ll use it to explain some of the core ideas behind stream processing—a concept that requires a shift from seeing tables as static snapshots to treating tables as materialized projections of a continuous stream of changes.
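The "tables as materialized projections of a stream" idea can be made concrete with a few lines of code. Below is a minimal sketch (my own illustration, not Confluent's API): a "table" is simply the latest state obtained by replaying a keyed stream of change events, with a `None` value acting as a delete, much like a Kafka tombstone.

```python
# A tiny sketch of stream/table duality: a "table" is the state
# materialized by folding a stream of keyed change events.

def materialize(changelog):
    """Fold a stream of (key, value) upserts/deletes into a table."""
    table = {}
    for key, value in changelog:
        if value is None:       # None acts as a delete (tombstone)
            table.pop(key, None)
        else:                   # otherwise, upsert the latest value
            table[key] = value
    return table

# Each event updates a chess player's rating; replaying the stream
# yields the current standings table.
events = [
    ("magnus", 2830),
    ("hikaru", 2780),
    ("magnus", 2839),   # later event supersedes the earlier one
    ("hikaru", None),   # tombstone removes the row
]

print(materialize(events))   # {'magnus': 2839}
```

Replaying the same stream always reproduces the same table, which is exactly the shift in perspective the chess analogy is meant to motivate.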

Confluent and Amazon EventBridge for Broad Event Distribution

Confluent has established itself as a leader in event streaming, providing not only a robust platform but also a rich portfolio of pre-built connectors. These connectors act as bridges, effortlessly channeling data between a multitude of systems, from databases and applications to cloud services. This extensive portfolio empowers users to weave together their data landscapes with remarkable ease and flexibility.