Latest Videos

How to Visualize Real-Time Data from Apache Kafka using Apache Flink SQL and Streamlit

Data visualization is cool, but have you tried setting up a chart of real-time data? In this video, Lucia Cerchie shows you how to create a live visualization of market data. She starts by producing data to a topic in Confluent Cloud from an Alpaca API websocket, then processes that data with Flink SQL, and finally uses a Streamlit component for a real-time visualization.
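The consuming side of that pipeline is easy to picture in code. Below is a minimal sketch, assuming JSON price events arrive on a hypothetical market-data topic in Confluent Cloud; the topic name, field names, and credentials are placeholders rather than the ones used in the video:

```python
# Minimal sketch: consume price events from a Kafka topic and chart them
# live in Streamlit. Topic, fields, and credentials are illustrative.
import json

import streamlit as st
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "<BOOTSTRAP_SERVERS>",  # Confluent Cloud endpoint
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<API_KEY>",
    "sasl.password": "<API_SECRET>",
    "group.id": "streamlit-viz",
    "auto.offset.reset": "latest",
})
consumer.subscribe(["market-data"])  # hypothetical topic name

st.title("Live market data")
chart = st.line_chart()  # empty chart we append rows to as events arrive

while True:
    msg = consumer.poll(1.0)
    if msg is None or msg.error():
        continue
    event = json.loads(msg.value())  # e.g. {"symbol": "...", "price": 123.4}
    chart.add_rows({"price": [event["price"]]})  # push the new point live
```

A bare while-loop keeps the sketch short; a real dashboard would typically cache the consumer so Streamlit's rerun model doesn't recreate it on every interaction.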

What Made Current 2024 Unforgettable? Hear From Our Attendees | Current 2024

In this recap video from Current 2024, attendees share their favorite moments from the event. From insightful talks on data streaming innovation to hands-on workshops and networking opportunities, hear what participants found most valuable.

Windowing with Table-Valued Functions | Apache Flink SQL

Apache Flink SQL makes it easy to implement analytics that summarize important attributes of real-time data streams. There are four different types of time-based windows in Flink SQL: tumbling, hopping, cumulating, and session. Learn how these various window types behave, and how to work with the table-valued functions that are at the heart of Flink SQL’s support for windowing.
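To make the TVF syntax concrete, here is a minimal tumbling-window sketch, run through PyFlink so it is self-contained; the orders table, its columns, and the datagen source are invented for illustration:

```python
# Minimal sketch of a tumbling-window aggregation with the TUMBLE
# table-valued function. The "orders" table and its columns are made up.
from pyflink.table import EnvironmentSettings, TableEnvironment

env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# A throwaway source table with a watermarked event-time column, so
# windows can close as event time advances.
env.execute_sql("""
    CREATE TABLE orders (
        price      DOUBLE,
        order_time TIMESTAMP(3),
        WATERMARK FOR order_time AS order_time - INTERVAL '5' SECOND
    ) WITH ('connector' = 'datagen', 'rows-per-second' = '10')
""")

# TUMBLE assigns each row to one fixed, non-overlapping 1-minute window;
# window_start and window_end come back as ordinary columns to group by.
env.execute_sql("""
    SELECT window_start, window_end, SUM(price) AS total
    FROM TABLE(
        TUMBLE(TABLE orders, DESCRIPTOR(order_time), INTERVAL '1' MINUTE))
    GROUP BY window_start, window_end
""").print()
```

Hopping, cumulating, and session windows follow the same pattern, substituting HOP, CUMULATE, or SESSION (each with its own interval arguments) for TUMBLE.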

How Thrivent Uses Real-Time Data for AI-Driven Fraud Detection

In today’s fast-paced financial services landscape, customers have a shorter attention span than ever. To meet clients’ growing demands for real-time access to information and keep innovating in areas like fraud detection and personalized financial advice, Thrivent needed to overhaul its data infrastructure. With data scattered across siloed legacy systems, diverse tech stacks, and multiple cloud environments, the challenge was daunting. But by adopting Confluent Cloud, Thrivent unified its disparate data systems into a single source of truth.

Why Real-Time Data is Crucial to Developing Generative AI Models

Learn how GEP, an AI-powered supply chain and procurement company, harnesses real-time data streaming through Confluent Cloud to fuel its generative AI solutions. With seamless integration into Azure OpenAI services and GPT models, GEP’s generative AI chatbot delivers document summaries and risk management insights to its customers.

How Confluent Fuels Gen AI Chat Models with Real-Time Data

Discover how GEP, an AI-powered procurement company, used Confluent's data streaming platform to transform its generative AI capabilities. Integrating real-time data into its AI models enabled GEP to provide a contextual chat-based service, which let GEP customers build their own tools simply by conversing with it in English.

Replication in Apache Kafka Explained | Monitoring & Troubleshooting Data Streaming Applications

Learn how replication works in Apache Kafka with a deep dive into its critical aspects. Whether you're a systems architect, developer, or just curious about Kafka, this video provides valuable insights and hands-on examples. Don't forget to check out our GitHub repo to get all of the code used in the demo, and to contribute your own enhancements.
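As a companion sketch (the demo code itself lives in the GitHub repo), here are the two topic-level settings replication revolves around, the replication factor and minimum in-sync replicas, set with the confluent-kafka Python AdminClient against an assumed local three-broker cluster; the topic name is illustrative:

```python
# Minimal sketch: create a topic whose partitions are replicated to 3
# brokers, requiring at least 2 in-sync replicas to acknowledge a write.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "localhost:9092"})

topic = NewTopic(
    "payments",                # hypothetical topic name
    num_partitions=6,
    replication_factor=3,      # each partition copied to 3 brokers
    config={
        # With acks=all, writes succeed only while >= 2 replicas are in
        # sync, so one broker can fail without data loss.
        "min.insync.replicas": "2",
    },
)

# create_topics returns {topic_name: future}; .result() raises on failure.
for name, future in admin.create_topics([topic]).items():
    future.result()
    print(f"Created topic {name}")
```

With replication.factor=3 and min.insync.replicas=2, a producer using acks=all keeps writing through a single broker failure, but receives an error rather than silently losing data if a second replica falls out of sync.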

How Booking.com Used Data Streaming to Put Travel Decisions into Customers' Hands

Booking.com wanted to give people a “connected trip” experience, allowing customers to seamlessly book flights, accommodations, car rentals, and excursions in one visit. The company realized the value of data streaming early on in reaching this goal, but the operational effort had become overwhelming. Learn how Booking.com found the answer in Confluent’s data streaming platform. With its automated configuration that required no ongoing maintenance, the team was able to prioritize innovation with data and provide the comprehensive booking experience they had been searching for.

How to Source Data from AWS DynamoDB to Confluent Using the Open-Source Connector

This one-minute video walks through an animated architectural diagram of an integration between Amazon DynamoDB and Confluent Cloud using an open-source Kafka connector. The integration spares you from maintaining custom code, automatically discovers and adapts to changes in DynamoDB tables, and the video covers the full setup.
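For a sense of what wiring this up looks like, here is a minimal sketch of registering a source connector with a self-managed Kafka Connect worker over its REST API; the connector class and the DynamoDB-specific property names are placeholders to be taken from the README of whichever open-source connector you choose:

```python
# Minimal sketch: register a source connector via the Kafka Connect REST
# API. Connector class and DynamoDB property names below are placeholders.
import requests

connector = {
    "name": "dynamodb-source",
    "config": {
        # Placeholder; substitute the connector's actual class name.
        "connector.class": "<DYNAMODB_SOURCE_CONNECTOR_CLASS>",
        "tasks.max": "1",
        "aws.region": "<AWS_REGION>",           # assumed property name
        "dynamodb.table.whitelist": "<TABLE>",  # assumed property name
    },
}

# POST to the worker's connectors endpoint; HTTP 201 means it was created.
resp = requests.post(
    "http://localhost:8083/connectors",  # default Connect REST port
    json=connector,
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```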