
How the Singapore Government is Building Agility to Enhance Citizen Services with IMDA's Tech Acceleration Lab and the Government Commercial Cloud+

Supply chain disruption. Pandemics and medical crises. Climate change. Events that were once “black swans” are now near-everyday occurrences for citizens worldwide—and governments are facing growing pressure to provide effective public services in response. Never before has the ability to access, process, and apply data to make informed decisions about complex issues been as crucial to creating effective public services.

Optimizing Serverless Stream Processing with Confluent Freight Clusters and AWS Lambda

Confluent has been instrumental in enabling customers across industries to build real-time stream processing solutions with Apache Kafka. While many of these use cases demand low latency, stream processing is also increasingly used to ingest logging and telemetry data. This type of data typically arrives at a high ingest rate but tolerates longer end-to-end processing times.
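To make the pattern concrete, here is a minimal, stdlib-only sketch (not Confluent's or AWS's reference implementation) of an AWS Lambda handler wired to a Kafka event source mapping, where Lambda delivers records grouped by topic-partition with base64-encoded values. The topic name and payload shape are illustrative assumptions:

```python
import base64
import json

def handler(event, context):
    """Lambda handler for a Kafka event source mapping.

    The event groups records by topic-partition; each record's
    value arrives base64-encoded.
    """
    decoded = []
    for topic_partition, records in event.get("records", {}).items():
        for record in records:
            # Assumes JSON-encoded payloads, e.g. telemetry readings.
            payload = json.loads(base64.b64decode(record["value"]))
            decoded.append(payload)
    # Batch-oriented work (e.g., landing telemetry in object storage)
    # happens here; high-throughput ingest tolerates the added latency.
    return {"processed": len(decoded)}
```

Because this style of workload is throughput-bound rather than latency-bound, batching aggressively inside the handler is usually the right trade-off.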

Powering AI Agents with Real-Time Data Using Anthropic's MCP and Confluent

Model Context Protocol (MCP), introduced by Anthropic, is a new standard that simplifies artificial intelligence (AI) integrations by providing a secure, consistent way to connect AI agents with external tools and data sources. When we saw MCP’s potential, we immediately started exploring how we could bring real-time data streaming into the mix. With our long history of supporting open source and open standards, building an MCP server was a natural fit.
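MCP messages are JSON-RPC 2.0 envelopes, which makes the wire format easy to sketch. The following stdlib-only illustration builds a `tools/call` request per the MCP specification; the tool name `consume_topic` and its arguments are hypothetical examples of what a Kafka-bridging MCP server might expose, not a documented Confluent API:

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool on an MCP server that bridges to Kafka topics.
request = make_tool_call(1, "consume_topic", {"topic": "orders", "max_messages": 10})
```

Any client and server that agree on this envelope can interoperate, which is exactly why a standard protocol beats bespoke per-tool integrations.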

New with Confluent Platform 7.9: Oracle XStream CDC Connector, Client-Side Field Level Encryption (EA), Confluent for VS Code, and More

At Confluent, we’re committed to building the world's leading data streaming platform, which gives you the ability to stream, connect, process, and govern all of your data, and make it available wherever it’s needed—however it’s needed—in real time. Today, we're excited to announce the release of Confluent Platform 7.9! This release builds upon Apache Kafka 3.9, reinforcing our core capabilities as a data streaming platform.

New in Confluent Cloud: Tableflow, Freight Clusters, Apache Flink AI Enhancements, and More

Our Q1 Confluent Cloud launch comes to you from Current Bengaluru, where data streaming industry experts have gathered to showcase how real-time streaming with Apache Kafka, Apache Flink, and Apache Iceberg is enabling generative artificial intelligence (AI) use cases and helping their organizations take innovation to the next level.

The Confluent Q1 '25 Launch

The Confluent Q1 ’25 Launch includes Freight clusters and Tableflow, features that make streaming data cheaper and feeding your data lake easier. We’re also introducing AI tools for Apache Flink, a VS Code extension, a new premium Oracle CDC Connector, and more! Our quarterly launches provide a single resource to learn about the growing number of new features we’re bringing to Confluent Cloud, our cloud-native data streaming platform.

Tableflow: Represent Kafka topics as Apache Iceberg or Delta Lake tables in a few clicks

Tableflow represents Kafka topics and their associated schemas as open-table formats such as Apache Iceberg (Generally Available) or Delta Lake (Early Access) in a few clicks, feeding any data lake, warehouse, or analytics engine. It removes the need for the complex, costly, and error-prone data pipelines currently used to feed streaming data to data lakes, while delivering strong read performance and automated data maintenance. Customers can also bring their own storage, ensuring flexibility, cost savings, and security.

Ep 4 - From Legacy to Cutting-Edge: Henry Schein One's Data Streaming Vision

Despite its value, legacy data can feel like a roadblock in a fast-paced digital world—Henry Schein One is clearing the path forward with real-time data streaming. In this episode, Chris Kapp, Software Architect at Henry Schein One (HS1), shares how his team modernizes data management to stay competitive and unlock real-time insights. Get ready to future-proof your data strategy with this must-listen episode for technology leaders facing scalability, governance, or integration challenges.

Real-Time Toxicity Detection in Games: Balancing Moderation and Player Experience

Toxic player behavior can drive players away from an otherwise successful game. While friendly banter and trash talk can build camaraderie, context matters: what’s acceptable among longtime friends might be harassment to a stranger. How can studios create a positive atmosphere for all while still allowing friends to have a little fun at each other’s expense?

The Business Value of the DSP: Part 1 - From Apache Kafka to a DSP

In our 2024 Data Streaming Report, we surveyed 4,110 IT leaders about the pivotal role that data streaming plays in their businesses. Of the respondents, 91% indicated that data streaming platforms (DSPs) are critical or important for achieving their data-related objectives—a result that comes as no surprise. Most IT teams—from architecture and integration teams to application and data engineers—that are using Confluent’s DSP get it.

The Business Value of the DSP: Part 2 - A Framework for Measuring Impact

In our previous blog post, we explored how Confluent has evolved into a comprehensive data streaming platform (DSP). Now that we understand what a DSP is, let's address a key question: How does it deliver business value? When business stakeholders assess value, they typically look at how a solution drives benefits across three key areas. So how does a DSP support them?

Data-Driven Business Agility: Adapting to Market Changes in Real Time

Companies need to adapt quickly to stay ahead of their competitors. This is where data-driven agility becomes essential. By leveraging real-time data, businesses can immediately respond to market changes and confidently make informed decisions. This article will explain data-driven agility, how it works, and why it’s a valuable approach for any organization. At its core, data-driven agility involves using live data to predict and respond to changes.

Data Accessibility: The Key to a Data-Driven Business Strategy

Easy access to data is a key part of any successful business strategy. Accessing, sharing, and analyzing data quickly helps organizations make smarter decisions, streamline operations, and stay competitive. However, many businesses face a significant challenge: they collect vast amounts of data from different sources yet often lack the right tools, processes, or infrastructure to make that data easy to access and use across the company.

How Data-Driven Decision-Making Fuels Competitive Advantage

Relying on intuition alone isn’t enough to stay ahead of the game in today’s fast-paced business environment. Success now comes from smart, data-driven decision-making backed by real-time insights and analytics. With stream processing and modern analytics platforms, businesses can collect, process, and analyze information as it happens, giving them a clear edge.

Streaming Data Fuels Real-time AI & Analytics: Connect with Confluent Q1 Program Entrants

In today’s fast-moving digital economy, organizations need real-time intelligence to power AI, analytics, and increasingly fast-paced decision-making. But to successfully deploy AI and advanced analytics, businesses must operate on trusted, up-to-date data streams that provide an accurate picture of what’s happening right now.

Protect Your Data With Self-Managed Keys (BYOK) Enhancements

In today’s rapidly evolving data security landscape, it’s critical for organizations to secure their services, particularly in the face of rising cyber threats. Robust security measures for streaming data are vital to safeguard against breaches and losses, and help to maintain trust among customers and partners, while ensuring compliance with regulatory requirements.

Flink AI: Hands-On FEDERATED_SEARCH() to Search a Vector Database with Confluent Cloud for Apache Flink

With the advent of modern Large Language Models (LLMs), Retrieval Augmented Generation (RAG) has become a de facto technology choice for extracting insights from a variety of data sources using natural language queries. RAG combined with LLMs presents many new possibilities for integrating generative AI capabilities into existing business applications, opening up many new use cases within the data streaming and analytics space.
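Under the hood, a vector search such as FEDERATED_SEARCH() boils down to ranking stored embeddings by similarity to a query embedding. This self-contained toy (not the Confluent Cloud implementation, which delegates to an external vector database) shows that retrieval step using cosine similarity over in-memory vectors:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length, non-zero vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query_vec, docs, k=2):
    """docs: list of (doc_id, embedding). Returns the k nearest doc_ids."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d[1]), reverse=True)
    return [doc_id for doc_id, _ in ranked[:k]]
```

A real RAG pipeline would first embed the user's question with a model and then fetch the top-k matches from a managed index, but the ranking logic is the same: the retrieved documents become context for the LLM prompt.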

Ep 3 - The Connective Tissue: Shift Left to Turn Data Chaos Into Clarity

In the final episode of our 3-part series on the basics of data streaming, we take a deep dive into data integration—covering everything from data governance to data quality. Our guests, Mike Agnich, General Manager of Data Streaming Platform, and David Araujo, Director of Product Management at Confluent, explain why connectors are must-haves for integrating systems.