
Data Streaming

How Confluent Fuels Gen AI Chat Models with Real-Time Data

Discover how GEP, an AI-powered procurement company, used Confluent's data streaming platform to transform its generative AI capabilities. Integrating real-time data into its AI models enabled GEP to offer a contextual chat-based service, through which customers can build their own tools simply by conversing with the chatbot in plain English.

Replication in Apache Kafka Explained | Monitoring & Troubleshooting Data Streaming Applications

Learn how replication works in Apache Kafka. This deep dive covers its critical aspects with valuable insights and hands-on examples, whether you're a systems architect, a developer, or just curious about Kafka. Don't forget to check out our GitHub repo to get all of the code used in the demo and to contribute your own enhancements.
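One core piece of Kafka replication is the interplay between the replication factor, the in-sync replica (ISR) set, and the broker's `min.insync.replicas` setting. As a toy illustration (not the demo code from the video or repo), this sketch models the rule a broker applies before acknowledging an `acks=all` produce request:

```python
# Toy model of Kafka's acks=all acknowledgement rule. Real brokers enforce
# this internally; this function only illustrates the decision.
# A produce request with acks=all is acknowledged only if the partition's
# current in-sync replica (ISR) count meets min.insync.replicas; otherwise
# the producer receives a NOT_ENOUGH_REPLICAS error.
def can_ack_write(isr_count: int, min_insync_replicas: int) -> bool:
    """Return True if a producer using acks=all would get an ack."""
    return isr_count >= min_insync_replicas

# With replication.factor=3 and min.insync.replicas=2:
assert can_ack_write(isr_count=3, min_insync_replicas=2)      # all replicas healthy
assert can_ack_write(isr_count=2, min_insync_replicas=2)      # one follower lagging, dropped from ISR
assert not can_ack_write(isr_count=1, min_insync_replicas=2)  # only the leader left: write rejected
```

This is why `min.insync.replicas=2` with a replication factor of 3 is a common durability baseline: the cluster tolerates one lagging replica without refusing writes, while still guaranteeing every acknowledged write lives on at least two brokers.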

How Booking.com Used Data Streaming to Put Travel Decisions into Customers' Hands

Booking.com wanted to give people a “connected trip” experience, allowing customers to seamlessly book flights, accommodations, car rentals, and excursions in one visit. The company recognized early on the value of data streaming in reaching this goal, but the operational effort had become overwhelming. Learn how Booking.com found the answer in Confluent’s data streaming platform. With its automated configuration that required no ongoing maintenance, the team was able to prioritize innovation with data and deliver the comprehensive booking experience they had been searching for.

How to source data from AWS DynamoDB to Confluent using the Open-Source Connector

This one-minute video walks through an animated architectural diagram of an integration between Amazon DynamoDB and Confluent Cloud using an open-source Kafka connector. The integration lets you avoid maintaining custom code and automatically discovers and adapts to changes in DynamoDB tables. Full details are provided.
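In a self-managed Kafka Connect setup, an open-source source connector like this is typically registered by POSTing a JSON configuration to the Connect REST API. The sketch below builds such a payload; the connector class and property names are illustrative placeholders, not the actual keys from the video, so check the specific connector's documentation for the exact configuration it supports:

```python
import json

# Sketch of a Kafka Connect source-connector registration payload.
# "connector.class" and the dynamodb.* keys below are hypothetical
# placeholders -- the real open-source DynamoDB connector defines its
# own property names.
connector_config = {
    "name": "dynamodb-source",
    "config": {
        "connector.class": "DynamoDBSourceConnector",  # placeholder class name
        "aws.region": "us-east-1",                     # hypothetical key
        "dynamodb.table.whitelist": "orders",          # hypothetical key
        "tasks.max": "1",
    },
}

payload = json.dumps(connector_config)
# In practice this JSON is POSTed to the Connect worker's REST API, e.g.:
#   curl -X POST -H "Content-Type: application/json" \
#        --data @connector.json http://localhost:8083/connectors
print(payload)
```

Once registered, the Connect worker runs the connector's tasks and streams table changes into Kafka topics without any custom polling code on your side.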

How to source data from AWS DynamoDB to Confluent using Kinesis Data Streams and Connect

This one-minute video walks through an animated architectural diagram of an integration between Amazon DynamoDB and Confluent Cloud using Kinesis Data Streams and the Kinesis Data Streams connector. This fully managed, serverless approach reduces operational complexity while remaining scalable and cost-effective.