
Kafka

Lenses.io joins forces with Celonis to bring streaming data to business execution

Today, I’m thrilled to announce that Lenses.io is joining Celonis, the leader in execution management. Together we will raise the bar in how businesses are run by driving them with real-time data, making the power of streaming open, operable and actionable for organizations across the world. When Lenses.io began, we could never have imagined we’d reach this moment.

Introducing the Kafka to Celonis Sink Connector

Apache Kafka has grown from an obscure open-source project into a mass-adopted streaming technology, supporting all kinds of organizations and use cases. Many began their Apache Kafka journey by feeding a data warehouse for analytics, then moved on to building event-driven applications and breaking down entire monoliths. Now we move to the next chapter: by joining Celonis, we're pleased to open up the possibility of real-time process mining and business execution with Kafka.

How to Load Test Your Kafka Producers and Consumers using k6

Recently, k6 started supporting extensions, which let the community extend its capabilities to cover additional use cases, and the community has already built plenty of them. k6 extensions are written in Go, and many of them reuse existing Go libraries, which makes k6 a versatile tool for testing different protocols and adapting to many scenarios. This post is the third part of my series on testing various systems with k6; this time, let's look at how to load test the popular Apache Kafka project.
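To make the approach concrete, here is a minimal load-test sketch assuming a k6 binary built with the community xk6-kafka extension. The module path, the Writer/Reader/SchemaRegistry API, and the broker address and topic name are assumptions that vary by extension version, so check the extension's documentation before running anything like this.

```typescript
// Load-test sketch for a k6 binary built with the xk6-kafka extension.
// Broker address and topic name below are hypothetical.
import { check } from "k6";
import { Writer, Reader, SchemaRegistry, SCHEMA_TYPE_STRING } from "k6/x/kafka";

const brokers = ["localhost:9092"];
const topic = "xk6_kafka_test_topic"; // hypothetical topic name

const writer = new Writer({ brokers, topic });
const reader = new Reader({ brokers, topic });
const serde = new SchemaRegistry();

export const options = {
  vus: 10,          // 10 concurrent virtual users
  duration: "30s",  // run the produce/consume loop for 30 seconds
};

export default function () {
  // Produce one string-encoded message per iteration.
  writer.produce({
    messages: [
      {
        key: serde.serialize({ data: "order-1", schemaType: SCHEMA_TYPE_STRING }),
        value: serde.serialize({
          data: JSON.stringify({ orderId: 1, amount: 42 }),
          schemaType: SCHEMA_TYPE_STRING,
        }),
      },
    ],
  });

  // Read messages back and assert we actually received something.
  const messages = reader.consume({ limit: 1 });
  check(messages, { "received at least one message": (m) => m.length >= 1 });
}

export function teardown() {
  writer.close();
  reader.close();
}
```

With a sufficiently recent k6 build that bundles the extension, something like `k6 run kafka-test.ts` would drive ten virtual users producing and consuming for thirty seconds.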

Apache Kafka Deployments and Systems Reliability - Part 1

There are many ways that Apache Kafka has been deployed in the field. In our Kafka Summit 2021 presentation, we gave a brief overview of the many different configurations that have been observed to date. In this blog series, we will discuss each of these deployments and the deployment choices behind them, along with how those choices impact reliability.
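As a small taste of the kind of choice involved, here is a hedged sketch of producer settings that favour durability over latency, written with the kafkajs client; the brokers, topic and settings are illustrative assumptions, not taken from the post, and they only pay off when paired with a replicated topic and an appropriate min.insync.replicas.

```typescript
// Sketch of producer settings that trade throughput for durability, using kafkajs.
// Assumes a (hypothetical) 3-broker cluster and an "orders" topic created with
// replication.factor=3 and min.insync.replicas=2.
import { Kafka } from "kafkajs";

const kafka = new Kafka({
  clientId: "orders-service",
  brokers: ["broker-1:9092", "broker-2:9092", "broker-3:9092"],
});

// Idempotence plus acks=-1 (all in-sync replicas) avoids duplicate writes and
// only acknowledges a message once it has been replicated.
const producer = kafka.producer({ idempotent: true, maxInFlightRequests: 1 });

async function writeDurably() {
  await producer.connect();
  await producer.send({
    topic: "orders",  // hypothetical replicated topic
    acks: -1,         // wait for all in-sync replicas
    messages: [{ key: "order-1", value: JSON.stringify({ amount: 42 }) }],
  });
  await producer.disconnect();
}

writeDurably().catch(console.error);
```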

Operating Apache Kafka with Cruise Control

There are two big gaps in the Apache Kafka project when it comes to operating a cluster: monitoring the cluster efficiently, and managing failures and changes in it. There are no solutions for these inside the Kafka project itself, but there are many good third-party tools for both problems. Cruise Control is one of the earliest open-source tools to provide a solution for the failure-management problem, and lately it has addressed the monitoring problem as well.
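For context, Cruise Control is typically driven through its REST API. The sketch below assumes a local instance on its default port 9090 and uses endpoint paths documented in the Cruise Control wiki (state inspection and a dry-run rebalance); verify both against the version you deploy.

```typescript
// Minimal sketch of querying Cruise Control's REST API.
// Host, port and parameters are assumptions; check your deployment's docs.
const base = "http://localhost:9090/kafkacruisecontrol";

async function main() {
  // Inspect the state of Cruise Control's components (monitor, analyzer, executor...).
  const state = await fetch(`${base}/state?json=true`);
  console.log(await state.json());

  // Ask for rebalance proposals without executing them (dry run).
  const proposals = await fetch(`${base}/rebalance?dryrun=true&json=true`, {
    method: "POST",
  });
  console.log(await proposals.json());
}

main().catch(console.error);
```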

Event-Driven Architecture is unblocking data-driven decisions in shipping

In March 2021, a 200,000-tonne ship got stuck in the Suez Canal, and the global shipping industry suddenly caught the world's attention. It made us realize that ships play an important role in our daily lives. Really important, in fact: 90% of the things we consume arrive by ship. Take a look at this map. Visualizing vessel routes over time traces out a map of the earth. Note the lack of vessels travelling close to the coast of Somalia, where piracy is common.

Assessing security risks with Kafka audits

Suppose that you work for the infosec department of a government agency in charge of tax collection. You recently noticed that some tax fraud incident records went missing from a certain Apache Kafka topic. You panic. It is a common requirement for business applications to maintain some form of audit log, i.e. a persistent trail of all the changes to the application’s data. But for Kafka in particular, this can prove challenging.
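One common way to approach that requirement is to emit an immutable audit record to a dedicated, long-retention topic alongside every change to the business data. The sketch below illustrates that pattern with the kafkajs client; the topic name, record shape and choice of client are illustrative assumptions, not taken from the article.

```typescript
// Sketch of an application-level audit trail: every change to business data
// also produces an immutable audit record on a long-retention Kafka topic.
import { Kafka } from "kafkajs";

const kafka = new Kafka({ clientId: "tax-records-service", brokers: ["localhost:9092"] });
const producer = kafka.producer();

interface AuditRecord {
  actor: string;                          // who made the change
  action: "CREATE" | "UPDATE" | "DELETE"; // what kind of change
  entityId: string;                       // which record was touched
  at: string;                             // ISO-8601 timestamp
}

async function recordChange(audit: AuditRecord) {
  await producer.send({
    topic: "tax-fraud-incidents.audit", // hypothetical topic with multi-year retention
    acks: -1,                           // don't acknowledge until replicated: don't lose the trail
    messages: [{ key: audit.entityId, value: JSON.stringify(audit) }],
  });
}

async function main() {
  await producer.connect();
  await recordChange({
    actor: "analyst-42",
    action: "DELETE",
    entityId: "incident-1234",
    at: new Date().toISOString(),
  });
  await producer.disconnect();
}

main().catch(console.error);
```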

Increase compliance with Kafka audits

Suppose that you work for a government tax agency. You recently noticed that some tax fraud incident records have been leaked on the darknet. This information is held in a Kafka topic, and the incident response team wants to know who has accessed this data over the last six months. You panic. It is a common requirement for business applications to maintain some form of audit log, i.e. a persistent trail of all the changes to the application's data, precisely so they can respond to this kind of situation.