AI And Real-World Data: A New Era For Identifying And Curing Rare Diseases

In this episode of the "Data Cloud Podcast," Dana Gardner is joined by Chandi Kodthiwada, Vice President of Product Management at Komodo Health, to explore how Komodo Health utilizes vast and disparate data sources to generate unprecedented insights in life sciences and healthcare. They discuss the founding mission of Komodo Health, the challenges of building a comprehensive, de-identified data set, and AI’s role in reducing the burden of disease and improving patient outcomes.

Empowering the Data Streaming Ecosystem: Evolving Confluent Hub to Confluent Marketplace

Today marks a monumental step in our commitment to fueling the growth, reach, and impact of our global partner network. We’re thrilled to announce the official launch of Confluent Marketplace (formerly Confluent Hub), a centralized resource designed to accelerate innovation, drive connectivity, and dramatically simplify the developer experience within the data streaming landscape. For years, integration engineers have been the quiet force behind the modern digital world.

Fat Fingers? Not With Our K2K Config Schema Protector!

Picture this: It's 3 AM. You're on call in case there's an outage. A team on the other side of the world merged a PR and released a new version of the K2K Replicator, and it crashed. Consumer group lag is spiking through the roof. You're paged awake and reach for your laptop; the team has already reverted the PR and things are stabilising, but now you have to investigate what really happened, because a postmortem is due.
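The kind of "config schema protector" the title alludes to can be sketched as a pre-merge validation step: check a proposed config against a declared schema and reject typos and type mismatches before they ship. This is a minimal sketch in plain Python; the field names (`source_topic`, `target_topic`, `max_batch_size`) are hypothetical stand-ins, not the actual K2K Replicator configuration.

```python
# Minimal sketch of a config "schema protector": validate a replicator
# config against a declared schema before it can be merged or deployed.
# Field names here are hypothetical, not the real K2K Replicator config.

CONFIG_SCHEMA = {
    "source_topic": str,
    "target_topic": str,
    "max_batch_size": int,
}

def validate_config(config: dict) -> list[str]:
    """Return a list of human-readable errors; an empty list means valid."""
    errors = []
    for field, expected_type in CONFIG_SCHEMA.items():
        if field not in config:
            errors.append(f"missing required field: {field}")
        elif not isinstance(config[field], expected_type):
            errors.append(
                f"{field}: expected {expected_type.__name__}, "
                f"got {type(config[field]).__name__}"
            )
    # Unknown keys are usually fat-fingered versions of real ones.
    for field in sorted(set(config) - set(CONFIG_SCHEMA)):
        errors.append(f"unknown field (typo?): {field}")
    return errors

# A fat-fingered config: a misspelled field name, so the required one is missing.
bad = {
    "source_topic": "orders",
    "target_topic": "orders-replica",
    "max_batch_sise": "500",
}
print(validate_config(bad))
```

Run as a CI check, this turns a 3 AM page into a red pull-request status instead.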

Supercharging Qlik Open Lakehouse: Now Streaming, Trusted, Open, and AI-Ready

Earlier this year at Qlik Connect, we introduced Qlik Open Lakehouse, a fully managed, Apache Iceberg–based platform designed to make it easy and cost-effective for organizations to ingest, optimize, and manage data in open lakehouse architectures. The first version of Qlik Open Lakehouse has been generally available since September 2025.

How to assess data lake and data warehouse migrations to BigQuery

Embarking on a data lake or data warehouse migration to BigQuery can seem daunting, but a thorough assessment helps clarify the path forward. This video introduces Google Cloud's services and expert guidance for evaluating the cost and complexity of migrating your existing systems, providing a clear plan for your migration journey. Discover how initial assessments help estimate time and costs and identify the best approach for a successful migration from Snowflake, Teradata, Cloudera, Databricks, and more.

Connecting the Dots: Simplifying Multi-API Data Flows into Apache Kafka

In today’s data-driven software-as-a-service (SaaS) environments, the need for complete customer insights often requires fetching and sharing data that lives across multiple API endpoints. That’s why many of our customers want to use Confluent’s data streaming and integration capabilities to implement real-time API chaining—a technique that allows them to automatically follow relationships between APIs.
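API chaining of the sort described, where a field in one endpoint's response points to the next request, can be sketched in plain Python. The endpoints and field names below are hypothetical, `fetch` stands in for a real HTTP client, and the `emit` callback stands in for a Kafka producer writing to a topic.

```python
# Minimal sketch of API chaining: a customer record from one endpoint
# links to an orders endpoint, and each joined result is emitted
# downstream (in production, produced to a Kafka topic).
# Endpoints, field names, and the in-memory "API" are all hypothetical.

FAKE_API = {
    "/customers/42": {
        "id": 42,
        "name": "Acme",
        "orders_url": "/customers/42/orders",
    },
    "/customers/42/orders": [
        {"order_id": "A1", "total": 99.5},
        {"order_id": "A2", "total": 10.0},
    ],
}

def fetch(url: str):
    """Stand-in for an HTTP GET; a real pipeline would use an HTTP client."""
    return FAKE_API[url]

def chain_customer_orders(customer_url: str, emit) -> None:
    """Fetch a customer, follow its orders link, emit one enriched event per order."""
    customer = fetch(customer_url)
    for order in fetch(customer["orders_url"]):
        # In production this would be something like:
        #   producer.produce("customer-orders", value=json.dumps(event))
        emit({
            "customer_id": customer["id"],
            "customer_name": customer["name"],
            **order,
        })

events = []
chain_customer_orders("/customers/42", events.append)
print(events)
```

The point of the pattern is that the relationship between the two APIs is followed automatically per record, so downstream consumers see one flat, enriched event stream rather than two disjoint ones.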