
Kafka

Introducing Confluent's JavaScript Client for Apache Kafka

From humble beginnings, Apache Kafka steadily rose to prominence and now sits as the backbone of data streaming for thousands of organizations worldwide. With its robust API, cloud-native implementations like Confluent Cloud, and synergy with other technologies like Apache Flink, Kafka has grown to cover many use cases across a broad range of industries.

Are You Misconfiguring Producer Retries? | Kafka Developer Mistakes

Producer retries in Apache Kafka can make or break message delivery, especially during broker events like updates or failures. Use the idempotent producer and configure delivery timeouts to avoid common pitfalls that lead to lost messages or broken ordering.
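A minimal Java sketch of one way to configure such a producer; the broker address, topic name, and timeout values are illustrative placeholders, not recommendations:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SafeProducerConfig {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // Idempotence lets the broker discard duplicates created by retried sends,
        // preserving ordering and avoiding double-writes.
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, "true");
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        // Bound the total time a record may spend on send + retries + in-flight requests,
        // rather than tuning the raw retries count directly.
        props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, "120000"); // 2 minutes (illustrative)
        props.put(ProducerConfig.REQUEST_TIMEOUT_MS_CONFIG, "30000");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "order-42", "created"),
                    (metadata, exception) -> {
                        if (exception != null) {
                            // Delivery ultimately failed within delivery.timeout.ms.
                            exception.printStackTrace();
                        }
                    });
        }
    }
}
```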

Are You Using the Wrong Partition Key? | Kafka Developer Mistakes

Picking the wrong partition key in Apache Kafka? That’s a fast track to performance headaches—think unbalanced loads, slowdowns, and broken message ordering. Choosing the right partitioning strategy keeps your data flowing smoothly and avoids hot partitions.
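As a sketch of the difference, the Java snippet below contrasts keying by a high-cardinality field with keying by a low-cardinality one; the "payments" topic, field names, and broker address are assumptions for illustration:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class PartitionKeyExample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            String customerId = "customer-1138";      // high cardinality, evenly spread
            String countryCode = "US";                // low cardinality, prone to hot partitions
            String paymentJson = "{\"amount\": 42.00}";

            // Records with the same key always land on the same partition, so keying by
            // customerId balances load while preserving per-customer ordering.
            producer.send(new ProducerRecord<>("payments", customerId, paymentJson));

            // Keying by country code funnels most traffic into a few partitions,
            // creating hot partitions and uneven consumer lag.
            producer.send(new ProducerRecord<>("payments", countryCode, paymentJson));
        }
    }
}
```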

Why Short-Lived Connections Are Killing Your Performance! | Kafka Developer Mistakes

Constantly starting and stopping Apache Kafka producers and consumers? That's a recipe for high resource usage and inefficiency. Short-lived connections are heavy on resources because each new client has to re-establish broker connections and re-fetch metadata, and frequent consumer churn triggers group rebalances that can slow down your whole cluster. Keep clients running to boost performance, cut latency, and get the most out of your Kafka setup.
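One common way to avoid this in Java is to share a single long-lived producer across the application instead of creating one per message; the class below is an illustrative sketch with a placeholder broker address:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SharedProducer {
    // One long-lived, thread-safe producer for the whole application, so each send
    // reuses existing TCP connections, metadata, and buffers instead of paying the
    // cost of constructing a new KafkaProducer every time.
    private static final KafkaProducer<String, String> PRODUCER = create();

    private static KafkaProducer<String, String> create() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        return new KafkaProducer<>(props);
    }

    public static void send(String topic, String key, String value) {
        PRODUCER.send(new ProducerRecord<>(topic, key, value));
    }

    public static void close() {
        // Flush buffered records and release connections once, at application shutdown.
        PRODUCER.close();
    }
}
```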

Why Relying on Default Settings Can Cost You! | Kafka Developer Mistakes

Default settings in Apache Kafka work when you're getting started, but they aren't suited for production. Sticking with defaults like a seven-day retention policy or a replication factor of one can cause storage issues, or data loss if a broker fails. Learn why tuning retention periods, replication factors, and partition counts is crucial for better Kafka performance and reliability.
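For illustration, the Java sketch below creates a topic with explicit partition, replication, and retention settings via the AdminClient; the topic name, counts, and values are assumptions, not recommendations:

```java
import java.util.List;
import java.util.Map;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.config.TopicConfig;

public class CreateTopicWithExplicitSettings {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker

        try (AdminClient admin = AdminClient.create(props)) {
            // 12 partitions and replication factor 3 instead of relying on broker defaults.
            NewTopic orders = new NewTopic("orders", 12, (short) 3)
                    .configs(Map.of(
                            // 3-day retention instead of the 7-day default.
                            TopicConfig.RETENTION_MS_CONFIG, String.valueOf(3L * 24 * 60 * 60 * 1000),
                            // Require at least 2 in-sync replicas before a write is acknowledged.
                            TopicConfig.MIN_IN_SYNC_REPLICAS_CONFIG, "2"));

            admin.createTopics(List.of(orders)).all().get();
        }
    }
}
```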

Luggage lost in a world of streaming data

The need to democratize and share data inside and outside your organization as real-time data streams has never been greater. Treating real-time data as a product and adopting Data Mesh practices is the way forward. Here, we explain the concept through a real-life example of an airline building applications that process data across different domains.

Why Using Outdated Versions Hurts Your System! | Kafka Client Mistakes

Keeping your Apache Kafka clients up to date is critical for maximizing performance, security, and stability. In this video, we discuss why sticking with old versions puts you at risk: it means missing out on dozens of new features and hundreds of bug fixes and security patches. Learn why upgrading is more than a "nice-to-have": it's essential for a smoother and safer Kafka experience.