Lenses

Mar 1, 2022   |  By Eleftherios Davros
Until recently, teams were building a small handful of Kafka streaming applications. They were usually associated with Big Data workloads (analytics, data science, etc.), and data serialization would typically be in Avro or JSON. Now a wider set of engineering teams are building entire software products with microservices decoupled through Kafka. Many teams have adopted Google Protobuf as their serialization format, partly due to its use in gRPC.
Feb 16, 2022   |  By Christina Daskalaki
Kafka is a ubiquitous component of a modern data platform. It has acted as the buffer, landing zone, and pipeline to integrate your data to drive analytics, or maybe surface after a few hops to a business service. More recently, though, it has become the backbone for new digital services with consumer-facing applications that process live off the stream. As such, Kafka is being adopted by dozens (if not hundreds) of software and data engineering teams in your organization.
Oct 4, 2021   |  By Antonios Chalkiopoulos
Today, I’m thrilled to announce that Lenses.io is joining Celonis, the leader in execution management. Together we will raise the bar in how businesses are run by driving them with real-time data, making the power of streaming open, operable and actionable for organizations across the world. When Lenses.io began, we could never have imagined we’d reach this moment.
Oct 4, 2021   |  By Stefan Bocutiu
Apache Kafka has grown from an obscure open-source project to a mass-adopted streaming technology, supporting all kinds of organizations and use cases. Many began their Apache Kafka journey by feeding a data warehouse for analytics, then moved on to building event-driven applications and breaking down entire monoliths. Now, we move to the next chapter. Joining Celonis means we can open up the possibility of real-time process mining and business execution with Kafka.
Sep 20, 2021   |  By Alex Durham
Here we are, our screens split and fingers poised to forage through two days of fresh Kafka content at Kafka Summit Americas. Tweet us your #KafkaSummit highlights if they’re missing here and we can add them to the round-up.
Aug 15, 2021   |  By Dave Harper
In March 2021, a 200,000 tonne ship got stuck in the Suez Canal, and the global shipping industry suddenly caught the world’s attention. It made us realize ships play an important role in our daily lives. Really important, in fact: 90% of the things we consume arrive by ship. Take a look at this map. By visualizing vessel routes over time, the pattern creates a map of the earth. Note the lack of vessels travelling close to the coast of Somalia, where piracy is common.
Aug 4, 2021   |  By Andrea Fiore
Suppose that you work for a government tax agency. You recently noticed that some tax fraud incident records have been leaked on the darknet. This information is held in a Kafka topic, and the incident response team wants to know who has accessed this data over the last six months. You panic. It is a common requirement for business applications to maintain some form of audit log, i.e. a persistent trail of all the changes to the application’s data, to respond to this kind of situation. But for Kafka in particular, this can prove challenging.
Jul 27, 2021   |  By Rafal Krupiński
We live in a dynamic world. It is safe to say that companies aim to speed up time-to-market and out-innovate their competition with Kafka, but at the same time struggle with some limitations. These can range from compliance-related setbacks for regulations such as GDPR, CCPA and HIPAA, to self-service slip-ups that could see a whole Kafka cluster going down. Even something as seemingly innocuous as configuring and creating a Kafka Topic can lead to operational U-turns, slowdowns and even downtime.
Jul 21, 2021   |  By Yiannis Glampedakis
In our world of streaming applications, developers are forever climbing a steep learning curve to stay successful with technologies such as Apache Kafka. There is no end to the debt and the detail you need to manage when it comes to Kafka - and particularly since it doesn’t come with guardrails to help you out, the stakes for making mistakes are high.
Apr 20, 2021   |  By Lenses
How to add metadata tags and descriptions to topics and other entities of your #Kafka real-time #datacatalog
Apr 15, 2021   |  By Lenses
Lenses 4.2 extends data observability into PostgreSQL instances so you can better explore and debug your streaming microservices. Here, Andrea walks you through creating a connection to #Postgres and exploring data with #SQL
Nov 24, 2020   |  By Lenses
Matteo de Martino from the Lenses.io Engines team shows you how to read, process & manipulate streaming data with #SQL in any #ApacheKafka. Plus a look at why we use SQL as a common language for streaming in the first place.
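The kind of SQL-on-streams workflow described here can be sketched roughly as follows. This is an illustrative Lenses SQL snippet, not taken from the talk: the topic and field names are made up, and exact syntax can vary by Lenses version.

```sql
-- Snapshot query: browse a topic interactively
SELECT customer_id, amount
FROM payments
WHERE amount > 100
LIMIT 10;

-- Continuous query (SQL processor): keep writing
-- matching records to a new topic as they arrive
SET defaults.topic.autocreate=true;

INSERT INTO large_payments
SELECT STREAM customer_id, amount
FROM payments
WHERE amount > 100;
```

The point of using SQL as the common language is that the same filtering and projection syntax serves both ad-hoc exploration and long-running stream processing.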
Nov 23, 2020   |  By Lenses
Adamos Loizou walks through how to wrangle data in #Kafka.
Oct 8, 2020   |  By Lenses
Automate your deployment of Lenses.io for your #AWS Managed Streaming for #ApacheKafka (#MSK) directly into your AWS VPC through portal.lenses.io. You'll be practicing #DataOps in minutes.
Oct 7, 2020   |  By Lenses
Explore Apache Kafka & Lenses.io through a demo environment packed with sample data and data flows.
Oct 6, 2020   |  By Lenses
David Esposito, a Solutions Architect from #Aiven, explores load testing in Kafka Office Hours, a recurring forum for #ApacheKafka thought-sharing hosted by our partner Lenses.io.
Sep 30, 2020   |  By Lenses
Walkthrough of how to create custom Serde classes that deserialise #Protobuf data in #ApacheKafka to explore and process in Lenses.io with #SQL
Aug 25, 2020   |  By Lenses
The new open-source #ApacheKafka Connect sink connector for #S3 gives you full control over how you sink data to S3, saving money on long-term storage costs in #Kafka. The connector can flush data out in a number of different formats, including #AVRO, #JSON, #Parquet and #Binary, and can partition the data written to S3 based on partitions, metadata fields and value fields.
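A sink like this is configured through Kafka Connect properties, with the output format and flush behaviour expressed in a KCQL statement. The sketch below is illustrative only: the connector class, property keys and bucket/topic names are assumptions and may differ between connector versions, so check the connector documentation before use.

```properties
name=s3-archive-sink
# Connector class name is an assumption; verify against your installed version
connector.class=io.lenses.streamreactor.connect.aws.s3.sink.S3SinkConnector
topics=payments
# KCQL: write the topic into a bucket/prefix as Parquet,
# flushing an S3 object every 5000 records
connect.s3.kcql=INSERT INTO my-bucket:payments SELECT * FROM payments STOREAS `PARQUET` WITH_FLUSH_COUNT = 5000
```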
Aug 25, 2020   |  By Lenses
Demo of how to manage your #KafkaConnect connectors in Lenses.io with #SQL support for mapping fields between source & target systems, #RBAC and secret management to avoid exposing credentials in your configuration.
Jul 24, 2020   |  By Lenses
DataOps is the art of progressing from data to value in seconds. For us, it's all about making data operations as easy and fast as using email.
Jul 23, 2020   |  By Lenses
Apache Kafka is a popular and powerful component of modern data platforms. But it's complicated: complicated to run, complex to manage and, crucially, near impossible to drive adoption of from the command line across your organization. So here's your how-to for seeing it through to production (... and possibly fame and fortune). We cover key principles for Kafka observability and monitoring.
Jul 1, 2020   |  By Lenses
Lenses, a DataOps platform, accelerates time to value, opening up data streams to a wide audience. Lenses enables rapid construction and deployment of data pipelines at scale, for enterprises, with governance and security.

Lenses ® is a DataOps platform that provides cutting-edge visibility into data streams, integrations, processing and operations, and enables ethical and secure data usage for the modern data-driven enterprise.

Accelerate time to value, open up data in motion to a wide audience, and enable rapid construction and deployment of data pipelines at scale, for enterprises, with governance and security.

Why Lenses?

  • Confidence in production: Everyone’s scared of the dark. Lenses gives you visibility into your streams and data platform that you’ve never had before. That means you don’t need to worry about running in production.
  • Keeping things simple: Life’s hard enough without having to learn new languages and manage code deployments. Build data flows in minutes with just SQL. Reduce the skills needed to develop, package and deploy applications.
  • Being more productive: Build, debug and deploy new flows in minutes, not days or weeks. In fact, many of our customers build and deploy apps to production 95% faster.

25,000+ Developers Trust Lenses for data operations over Kafka.