
Kafka

Use GitOps as an Efficient CI/CD Pipeline for Data Streaming

Early automation saves time and money. GitOps improves the CI/CD pipeline, enhancing operations and traceability. Learn to use GitOps for data streaming platforms and streaming applications with Apache Kafka and Confluent Cloud.
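
To make the GitOps idea concrete, here is a minimal sketch (my own illustration, not code from the article) of a pipeline step that reconciles a topic definition kept in Git against a cluster using Kafka's standard AdminClient API. The topic name, partition count, broker address, and retention setting are hypothetical.

```java
import java.util.Collections;
import java.util.Map;
import java.util.Properties;
import java.util.Set;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class TopicReconciler {
    public static void main(String[] args) throws Exception {
        // Desired state, normally parsed from a declarative file versioned in Git.
        String topic = "orders";            // hypothetical topic name
        int partitions = 6;                 // hypothetical partition count
        short replicationFactor = 3;

        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // hypothetical broker

        try (AdminClient admin = AdminClient.create(props)) {
            // Compare the desired state with the cluster and create the topic if it is missing.
            Set<String> existing = admin.listTopics().names().get();
            if (!existing.contains(topic)) {
                NewTopic newTopic = new NewTopic(topic, partitions, replicationFactor)
                        .configs(Map.of("retention.ms", "604800000")); // 7 days
                admin.createTopics(Collections.singleton(newTopic)).all().get();
            }
        }
    }
}
```

Run from CI on every merge, a step like this keeps the cluster converging toward whatever the repository declares, which is the traceability benefit the article highlights.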

Robust Disaster Recovery with Kafka and Confluent Cloud

Explore the resilience of Kafka, understand the implications of datacenter disruptions, and mitigate the impact of data loss. Learn how to scale with Confluent Cloud, use cluster and schema linking, and apply an active/passive disaster recovery pattern for business continuity.
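
For a sense of what active/passive failover can look like on the client side, here is a simplified, hypothetical sketch (not from the article): a consumer normally reads from the primary cluster and, if it fails, is recreated against the passive cluster's bootstrap servers, relying on cluster and schema linking to have mirrored the topic. The cluster addresses, group id, and topic name are assumptions, and real failover is usually triggered by external health checks rather than a client exception.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.KafkaException;

public class FailoverConsumer {
    // Hypothetical endpoints for the active and passive clusters.
    private static final String PRIMARY = "primary.example.com:9092";
    private static final String DR = "dr.example.com:9092";

    private static KafkaConsumer<String, String> consumerFor(String bootstrap) {
        Properties props = new Properties();
        props.put("bootstrap.servers", bootstrap);
        props.put("group.id", "orders-processor");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(List.of("orders"));
        return consumer;
    }

    public static void main(String[] args) {
        String bootstrap = PRIMARY;
        while (true) {
            try (KafkaConsumer<String, String> consumer = consumerFor(bootstrap)) {
                while (true) {
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                    records.forEach(r ->
                            System.out.printf("%s:%d %s%n", r.topic(), r.offset(), r.value()));
                }
            } catch (KafkaException e) {
                // Simplified failover: switch to the passive cluster (no health checks or failback).
                bootstrap = DR;
            }
        }
    }
}
```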

Challenges Using Apache Kafka

Streaming platforms need key capabilities for smooth operations: data ingestion, development experience, management, security, performance, and maintenance. Self-managed platforms like Apache Kafka can meet these needs, but can be costly and require intensive maintenance. On the other hand, Confluent Cloud offers fully managed services with features like scalable performance, auto-balancing, tiered storage, and enhanced security and resiliency. It provides systematic updates and maintenance, freeing users from infrastructure concerns. Confluent Cloud streamlines the creation of a global, well-governed data streaming platform.

How DISH Wireless Benefits From a Data Mesh Built With Confluent

"Over the last few years, DISH Wireless has turned to AWS partners like Confluent to build an entirely new type of telecommunication infrastructure—a cloud-native network built to empower developers. Discover how data streaming allows DISH Wireless to:— Deliver data products that turn network data into business value for customers— Harness massive volumes of data to facilitate the future of app communications— Seamlessly connect apps and devices across hybrid cloud environments.

Top 5 Best Practices for Building Event-Driven Architectures Using Confluent and AWS Lambda

Confluent and AWS Lambda can be used to build real-time, scalable, fault-tolerant event-driven architectures, ensuring that your application logic is executed reliably in response to specific business events. Confluent provides a streaming SaaS solution based on Apache Kafka® and built on Kora: The Cloud Native Apache Kafka Engine, allowing you to focus on building event-driven applications without operating the underlying infrastructure.
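
As a rough sketch of the Lambda side of such an architecture (my own illustration, not code from the post), here is a handler using the KafkaEvent type from the aws-lambda-java-events library, assuming a Kafka event source mapping is configured; the handler class name and topic semantics are hypothetical, and record values arrive base64-encoded.

```java
import java.util.Base64;
import java.util.List;
import java.util.Map;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.KafkaEvent;

// Minimal handler for record batches delivered by a Kafka event source mapping.
public class OrderEventHandler implements RequestHandler<KafkaEvent, Void> {

    @Override
    public Void handleRequest(KafkaEvent event, Context context) {
        // Records are grouped by topic-partition keys.
        for (Map.Entry<String, List<KafkaEvent.KafkaEventRecord>> entry : event.getRecords().entrySet()) {
            for (KafkaEvent.KafkaEventRecord record : entry.getValue()) {
                // Values are delivered base64-encoded.
                String value = new String(Base64.getDecoder().decode(record.getValue()));
                context.getLogger().log(
                        "topic=" + record.getTopic() + " offset=" + record.getOffset() + " value=" + value);
            }
        }
        // Returning normally signals success; throwing triggers the configured retry behavior.
        return null;
    }
}
```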

Apache Kafka Message Compression

Apache Kafka® supports incredibly high throughput. It’s been known for feats like supporting 20 million orders per hour to get COVID tests out to US citizens during the pandemic. Kafka's approach to partitioning topics helps achieve this level of scalability. Topic partitions are the main "unit of parallelism" in Kafka. What’s a unit of parallelism? It’s like having multiple cashiers in the same store instead of one.
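
Since the article covers message compression, here is a minimal producer sketch (my own illustration, with a hypothetical broker address and topic) that enables compression through the standard compression.type producer setting; keyed records also spread across the topic's partitions, the "unit of parallelism" mentioned above.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class CompressedProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker:9092"); // hypothetical broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Batches are compressed on the producer before being sent to the broker.
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4"); // also: gzip, snappy, zstd

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (int i = 0; i < 10; i++) {
                // Keys determine the partition, so distinct keys spread load across partitions.
                producer.send(new ProducerRecord<>("orders", "order-" + i, "payload-" + i));
            }
        }
    }
}
```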

Dataflow Programming with Apache Flink and Apache Kafka

Recently, I got my hands dirty working with Apache Flink®. The experience was a little overwhelming. I have spent years working with streaming technologies, but Flink was new to me and the resources online were rarely what I needed. Thankfully, I had access to some of the best Flink experts in the business to provide me with first-class advice, but not everyone has access to an expert when they need one.