
Kafka

Scaling Kafka Brokers in Cloudera Data Hub

This blog post provides guidance for administrators who are currently using, or are interested in using, Kafka in Cloudera Data Hub, covering how to scale broker nodes up or down to balance performance and cloud costs in production deployments. Because Kafka brokers are contained within host groups, administrators can add and remove nodes more easily, which gives them the flexibility to handle real-time data feed volumes as they fluctuate.
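
The resizing of host groups happens in the Cloudera tooling, but once brokers join or leave the cluster, existing partitions usually need to be redistributed across the new broker set. As a minimal sketch of that follow-up step, the snippet below uses Kafka's stock Admin API to move a partition onto a newly added broker (the bootstrap address, topic name, partition, and broker IDs are all illustrative):

```java
import java.util.List;
import java.util.Map;
import java.util.Optional;
import java.util.Properties;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewPartitionReassignment;
import org.apache.kafka.common.TopicPartition;

public class RebalanceAfterScaling {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Hypothetical bootstrap address; point this at your Data Hub brokers.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker0.example.com:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Move partition 0 of the "events" topic onto brokers 1 and 2
            // plus the newly added broker 4.
            Map<TopicPartition, Optional<NewPartitionReassignment>> moves = Map.of(
                new TopicPartition("events", 0),
                Optional.of(new NewPartitionReassignment(List.of(1, 2, 4))));
            admin.alterPartitionReassignments(moves).all().get();
        }
    }
}
```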

How to extend Kafka pipelines to users over the public internet

In the realm of event streaming, Kafka has become the number one choice for many organizations. It’s capable of handling and processing vast amounts of critical, time-sensitive event data. But Kafka isn’t designed for distributing data between internal systems and consumers on the public internet. Yet rich, live digital experiences are a must-have for users, and they are increasingly becoming a key source of differentiation. In this webinar we will show you how you can harness the power of Kafka to deliver engaging realtime applications to users, using Ably.

Kafka best practices: Monitoring and optimizing the performance of Kafka applications

Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications. Administrators, developers, and data engineers who use Kafka clusters often struggle to understand what is happening in their Kafka implementations.
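
A common first step in understanding a Kafka deployment is watching consumer lag, i.e. how far each consumer group trails the end of its partitions. As a minimal sketch (the group name and bootstrap address are illustrative), the Admin API can compute lag by comparing committed offsets against the latest end offsets:

```java
import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.ListOffsetsResult;
import org.apache.kafka.clients.admin.OffsetSpec;
import org.apache.kafka.clients.consumer.OffsetAndMetadata;
import org.apache.kafka.common.TopicPartition;

public class ConsumerLag {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // Offsets the group has committed ("my-group" is illustrative).
            Map<TopicPartition, OffsetAndMetadata> committed =
                admin.listConsumerGroupOffsets("my-group")
                     .partitionsToOffsetAndMetadata().get();

            // Latest (end) offsets for the same partitions.
            Map<TopicPartition, ListOffsetsResult.ListOffsetsResultInfo> ends =
                admin.listOffsets(committed.keySet().stream()
                         .collect(Collectors.toMap(tp -> tp, tp -> OffsetSpec.latest())))
                     .all().get();

            // Lag = end offset minus committed offset, per partition.
            committed.forEach((tp, meta) -> System.out.printf("%s lag=%d%n",
                tp, ends.get(tp).offset() - meta.offset()));
        }
    }
}
```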

Building a dependable realtime betting app with Confluent Cloud and Ably

Our everyday digital experiences are in the midst of a revolution. Customers increasingly expect their online experiences to be interactive, immersive, and realtime by default. The need to satisfy user expectations is driving the exponential growth of event-driven architectures in organizations of all shapes and sizes. And mobile drives this change further and faster by enabling users to have realtime experiences whenever and wherever they want, 24/7.

The Ably Kafka Connector - now Generally Available with enhanced pattern-based mapping capabilities

The Ably Kafka Connector has a raft of new enhancements, and is now available in a full general availability (GA) release. Developers now have the option to use pattern-based mapping rules to enable streaming of data from many Kafka topics to many Ably channels - ideal for chat solutions, live sports updates, live streaming, broadcasting notifications and alerts.
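
As a rough illustration of how such a many-topics-to-many-channels mapping might be configured, the sketch below follows the connector's properties style; the connector class and property names are drawn from the connector's documentation but should be verified against the version you deploy:

```properties
# Illustrative sink configuration; verify keys against the Ably Kafka
# Connector docs for your release.
name=ably-channel-sink
connector.class=com.ably.kafka.connect.ChannelSinkConnector
tasks.max=1
# Consume from many Kafka topics...
topics=chat.room1,chat.room2,scores.live
# ...and route each record to an Ably channel derived from its topic name
# via a pattern-based mapping rule.
channel=#{topic}
client.key=<YOUR_ABLY_API_KEY>
client.id=kafka-connector
```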

Ably launch Kafka Connector at Kafka Summit 2022 - London

Here at Ably, we're excited to announce our participation in, and Silver sponsorship of, the Kafka Summit 2022, taking place on 25-26 April. The Kafka Summit is the only dedicated technical conference for the Apache Kafka® community, and it's a great opportunity for anyone building large-scale event-driven systems to learn and share ideas. It's also the perfect event for us to launch the general availability of the Ably Kafka Connector.

Apache Kafka to BigQuery: 2 Easy Methods

Organizations today have access to wide streams of data generated from recommendation engines, page clicks, internet searches, product orders, and more. It is necessary to have an infrastructure that enables you to stream your data as it gets generated and carry out analytics on the go. To that end, incorporating a data pipeline for moving data from Apache Kafka to BigQuery is a step in the right direction.
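
One common way to build such a pipeline is a Kafka Connect sink. The sketch below uses the open-source BigQuery sink connector; the class and property names reflect one version of that connector and should be treated as illustrative rather than definitive:

```properties
# Illustrative Kafka Connect sink config for streaming Kafka topics into
# BigQuery; check property names against your connector version.
name=bigquery-sink
connector.class=com.wepay.kafka.connect.bigquery.BigQuerySinkConnector
tasks.max=1
# Topics to stream into BigQuery tables of the same name.
topics=orders,page_clicks
# Target GCP project and dataset (values are placeholders).
project=my-gcp-project
defaultDataset=kafka_events
# Service account credentials used to write to BigQuery.
keyfile=/etc/kafka/gcp-service-account.json
```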

Producing Protobuf data to Kafka

Until recently, teams were building only a small handful of Kafka streaming applications. These were usually associated with Big Data workloads (analytics, data science, etc.), and data would typically be serialized in Avro or JSON. Now a wider set of engineering teams is building entire software products with microservices decoupled through Kafka. Many teams have adopted Google Protobuf as their serialization format, partly due to its use in gRPC.
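
As a minimal sketch of producing Protobuf data to Kafka with Confluent's Schema Registry serializer (where OrderCreated stands in for a hypothetical class generated by protoc from your .proto file):

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import io.confluent.kafka.serializers.protobuf.KafkaProtobufSerializer;

public class ProtobufProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaProtobufSerializer.class);
        // The serializer registers the Protobuf schema with Schema Registry
        // and embeds the schema ID in each record.
        props.put("schema.registry.url", "http://localhost:8081");

        // OrderCreated is a hypothetical protoc-generated message class;
        // substitute your own generated type.
        OrderCreated order = OrderCreated.newBuilder()
                .setOrderId("o-123")
                .setAmountCents(4299)
                .build();

        try (KafkaProducer<String, OrderCreated> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", order.getOrderId(), order));
        }
    }
}
```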