
Confluent

86% of IT leaders say data streaming is a priority for IT investment in 2024

Confluent survey: 90% of respondents say data streaming platforms can lead to more product and service innovation in AI and ML development. 86% of respondents cite data streaming as a strategic or important priority for IT investments in 2024. For 91% of respondents, data streaming platforms are critical or important for achieving data-related goals.

How to Analyze Data from a REST API with Flink SQL

Join Lucia Cerchie in a coding walkthrough that bridges the gap between REST APIs and data streaming. Together we’ll transform the OpenSky Network's live API into a data stream using Kafka and Flink SQL. Along the way, we use Flink SQL to clean up the data, making it more readable and keeping more of the business logic out of the client code.
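The kind of cleanup the walkthrough performs in Flink SQL can be sketched in plain Python: the OpenSky `/states/all` response nests each aircraft's state as a positional array, which we flatten into named, trimmed records. This is a hypothetical sketch, not the walkthrough's code; the field positions follow the OpenSky REST API documentation, but verify them against the current API before relying on them.

```python
# Hypothetical sketch: flatten OpenSky's nested, positional state vectors
# into flat, named records -- the cleanup the walkthrough does in Flink SQL.
# Field positions assumed from the OpenSky REST API docs.

def flatten_states(payload: dict) -> list[dict]:
    """Turn positional state arrays into named, cleaned records."""
    records = []
    for state in payload.get("states") or []:
        records.append({
            "icao24": state[0],
            # Callsigns are space-padded; trim, and map empty strings to None.
            "callsign": (state[1] or "").strip() or None,
            "origin_country": state[2],
            "longitude": state[5],
            "latitude": state[6],
        })
    return records

# Shape mimics an OpenSky /states/all response (values are made up).
sample = {
    "time": 1700000000,
    "states": [["abc123", "SWR123  ", "Switzerland", 1700000000, None, 8.55, 47.45]],
}
print(flatten_states(sample))
```

In the actual walkthrough this reshaping happens declaratively in a Flink SQL query over the Kafka topic, so no client ever sees the raw positional arrays.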

Capital One Shares Insights on Cloud-Native Streams and Governance

Businesses that are best able to leverage data have a significant competitive advantage. This is especially true in financial services, an industry in which leading organizations are in constant competition to develop the most responsive, personalized customer experiences. Often, however, legacy infrastructure, data silos, and batch systems introduce significant technical hurdles.

How to use Flink SQL, Streamlit, and Kafka: Part 1

Market data analytics has always been a classic use case for Apache Kafka. However, new technologies have emerged since Kafka was created. Apache Flink has grown in popularity for stateful processing with low-latency output. Streamlit, a popular open-source component library and deployment platform, provides a familiar Python framework for crafting powerful, interactive data visualizations. Acquired by Snowflake in 2022, Streamlit remains agnostic with respect to data sources.

Defining Asynchronous Microservice APIs for Fraud Detection | Designing Event-Driven Microservices

In this video, Wade explores the process of decomposing a monolith into a series of microservices. Tributary Bank wants to start with its Fraud Detection service, but before it can, it first has to untangle the existing code and define a clean API that will allow the functionality to move to an asynchronous, event-driven microservice. You'll see how Tributary Bank extracts a variety of API methods from the existing monolith.
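A clean asynchronous API like the one described above is often expressed as a pair of event types rather than a blocking method call: the monolith publishes a request event and later consumes a result event. The sketch below illustrates that shape; the event names, fields, and fraud rule are invented for illustration, not taken from the video.

```python
# Hypothetical sketch of an asynchronous fraud-detection API as event types.
# Names and the toy rule are invented; they are not Tributary Bank's actual API.
from dataclasses import dataclass

@dataclass(frozen=True)
class FraudCheckRequested:
    """Event the monolith publishes instead of calling fraud code in-process."""
    transaction_id: str
    account_id: str
    amount_cents: int

@dataclass(frozen=True)
class FraudCheckCompleted:
    """Event the extracted microservice publishes with its verdict."""
    transaction_id: str
    approved: bool
    reason: str

def handle(event: FraudCheckRequested) -> FraudCheckCompleted:
    """Toy stand-in for the extracted service's logic."""
    if event.amount_cents > 1_000_000:
        return FraudCheckCompleted(event.transaction_id, False, "amount over limit")
    return FraudCheckCompleted(event.transaction_id, True, "ok")
```

Because the caller only depends on the event schemas, the fraud logic can move out of the monolith without the caller changing, and the events can later travel over Kafka instead of an in-memory call.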

Solving the Dual-Write Problem: Effective Strategies for Atomic Updates Across Systems

The dual-write problem occurs when two external systems must be updated in an atomic fashion. A classic example is updating an application’s database while pushing an event into a messaging system like Apache Kafka. If the database update succeeds but the write to Kafka fails, the system ends up in an inconsistent state. However, the dual-write problem isn’t unique to event-driven systems or Kafka. It occurs in many situations involving different technologies and architectures.
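One common mitigation is the transactional outbox pattern: write the business row and the outgoing event in a single database transaction, then have a separate relay publish the outbox rows to Kafka. The sketch below shows the idea with SQLite standing in for the application database and a plain function standing in for the relay; it is a minimal illustration, not a production implementation (a real relay would use change data capture or polling with delivery tracking).

```python
# Minimal sketch of the transactional outbox pattern: the order row and its
# event are committed in ONE transaction, so they can never diverge. A relay
# (stubbed here) later publishes outbox rows to Kafka.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id TEXT PRIMARY KEY, total_cents INTEGER)")
conn.execute("CREATE TABLE outbox (id INTEGER PRIMARY KEY, topic TEXT, payload TEXT)")

def place_order(order_id: str, total_cents: int) -> None:
    with conn:  # one atomic transaction: both inserts commit, or neither does
        conn.execute("INSERT INTO orders VALUES (?, ?)", (order_id, total_cents))
        conn.execute(
            "INSERT INTO outbox (topic, payload) VALUES (?, ?)",
            ("orders", json.dumps({"id": order_id, "total_cents": total_cents})),
        )

def drain_outbox() -> list[tuple[str, str]]:
    """Stand-in for the relay that would publish each row to Kafka."""
    rows = conn.execute("SELECT topic, payload FROM outbox ORDER BY id").fetchall()
    conn.execute("DELETE FROM outbox")
    conn.commit()
    return rows

place_order("o-1", 4200)
print(drain_outbox())
```

If the process crashes after the transaction commits but before the relay runs, the event is still safely in the outbox table and gets published on the next drain, giving at-least-once delivery instead of a silent inconsistency.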

Retrieval Augmented Generation (RAG) with Data Streaming

How do you prevent hallucinations from large language models (LLMs) in GenAI applications? LLMs need real-time, contextualized, and trustworthy data to generate the most reliable outputs. Kai Waehner, Global Field CTO at Confluent, explains how RAG and a data streaming platform with Apache Kafka and Flink make that possible.
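The retrieval step at the heart of RAG can be shown in a few lines: fetch the most relevant document for a query, then prepend it to the prompt so the model answers from fresh, trusted context. In this toy sketch, plain word overlap stands in for embedding similarity, and the documents are invented; a real system would query a vector store kept current by a Kafka/Flink streaming pipeline.

```python
# Toy sketch of RAG retrieval: word overlap stands in for embedding
# similarity, and a list stands in for a stream-fed vector store.
def score(query: str, doc: str) -> int:
    """Count shared words between query and document (toy similarity)."""
    return len(set(query.lower().split()) & set(doc.lower().split()))

def retrieve(query: str, docs: list[str]) -> str:
    """Return the single most relevant document for the query."""
    return max(docs, key=lambda d: score(query, d))

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend retrieved context so the LLM grounds its answer in it."""
    return f"Context: {retrieve(query, docs)}\nQuestion: {query}"

docs = [
    "Flight AB123 departed on time at 09:05.",
    "The cafeteria menu changes on Mondays.",
]
print(build_prompt("When did flight AB123 depart?", docs))
```

The streaming platform's role is keeping those documents current: if the flight status changes, the pipeline updates the store within seconds, so the next retrieval grounds the LLM in the latest fact rather than a stale one.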

Data Streaming Awards 2024: Nominations Are Now Open

The Data Streaming Awards is back for its third year! Designed to bring the data streaming community together, this one-of-a-kind industry award event recognizes organizations that are harnessing the power of this revolutionary technology to drive business and customer experience transformation. If you know a company (even your own team) that is using data streaming technology to transform their business and provide amazing value to their customers and communities, the time is now to submit a nomination.

Best Practices for Confluent Terraform Provider

Managing Confluent Cloud infrastructure efficiently poses challenges due to the complexities involved in deploying and maintaining various components like environments, clusters, topics, and authorizations. Without proper tooling and practices, teams struggle with manual configuration errors, lack of consistency, and potential security risks. The Confluent Terraform Provider addresses these challenges by letting teams manage these resources as code.