
Observability

Panel recap: What Is DataOps Observability?

Data teams and their business-side colleagues now expect—and need—more from their observability solutions than ever before. Modern data stacks create new challenges for performance, reliability, data quality, and, increasingly, cost. And the challenges faced by operations engineers differ from those faced by data analysts, which in turn differ from what business-side stakeholders care about. That’s where DataOps observability comes in.

Data Observability: 7 Trends to Watch in 2023

As organizations look to scale up and improve the business value of their growing data volumes, certain data trends have garnered the attention of data and business professionals alike. With this growth promising to continue in the upcoming year, data leaders are looking to implement tools to enrich their organization’s data like never before. Here are seven trends you can watch for in the new year.

What is DataOps Observability?

Data teams like yours face new challenges as they manage an increasing variety of data formats, expanding use cases, and data volumes that double every three years. Organizations increasingly depend on new data products to meet their financial objectives. Join SanjMo Advisory Services Co-Founder Sanjeev Mohan and Unravel Data Vice President of Solutions Engineering Chris Santiago to learn more.

What Is Data Observability in a Data Pipeline?

There are five things you need to know about data observability in a data pipeline. Becoming a data-driven organization is a vital goal for businesses of all sizes and industries—but this is easier said than done. Too many companies fail to attain the fundamental principle of data observability: knowing the existence and status of all the enterprise data at their fingertips.

Data 'Poka-Yoking' With Data Observability for the Modern Data Stack

While in the past, businesses used data to gain an edge over their rivals, in today’s competitive environment, data is imperative to stay in business. Modern businesses rely increasingly on data to manage all aspects of their operations, from everyday workflows to impacts on business strategy and customer interactions. As a result, data stacks have become extremely complex.

Transforming Kong Logs for Ingestion into Your Observability Stack

As a Solutions Engineer here at Kong, one question that frequently comes across my desk is “how can I transform a Kong logging plugin message into a format that my insert-observability-stack-here understands, i.e. ELK, Loki, Splunk, etc.?” In this blog, I’m going to show you how to convert a Kong logging payload to the Elastic Common Schema. To accomplish this, we’ll be running Kong Gateway in Kubernetes and using two Kong plugins.
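The blog itself does the conversion inside Kong; as a minimal sketch of the field mapping involved, the snippet below flattens a Kong HTTP Log plugin payload into Elastic Common Schema (ECS) field names in Python. The input keys (`client_ip`, `request`, `response`, `latencies`, `service`) follow the documented shape of Kong’s logging payload, but the exact fields available depend on your Kong version and plugin configuration, so treat this as an illustrative assumption rather than the article’s implementation.

```python
def kong_log_to_ecs(payload: dict) -> dict:
    """Map a Kong logging-plugin payload (assumed shape) to ECS field names."""
    request = payload.get("request", {})
    response = payload.get("response", {})
    latencies = payload.get("latencies", {})
    return {
        "http.request.method": request.get("method"),
        "url.original": request.get("uri"),
        "http.response.status_code": response.get("status"),
        "http.response.body.bytes": response.get("size"),
        "source.ip": payload.get("client_ip"),
        # Kong reports latencies in milliseconds; ECS event.duration is nanoseconds.
        "event.duration": latencies.get("request", 0) * 1_000_000,
        "service.name": (payload.get("service") or {}).get("name"),
    }

# Hypothetical example payload for illustration only.
example = {
    "client_ip": "203.0.113.7",
    "request": {"method": "GET", "uri": "/orders"},
    "response": {"status": 200, "size": 512},
    "latencies": {"request": 45},
    "service": {"name": "orders-api"},
}
ecs_doc = kong_log_to_ecs(example)
```

In practice this mapping would live in the gateway (e.g. via Kong’s transformation plugins) rather than in downstream code, which is the approach the post describes.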

Kensu partners with Collibra to automate data catalog completion

Kensu announces its partnership with Collibra, the Data Intelligence company, and the availability of an integration between the two solutions. Kensu's observability capabilities will enrich Collibra's Catalog with clean, trustworthy, and curated information to enable business users and data scientists to make business decisions based on reliable data.

Does Your Company Need a Data Observability Framework?

You have been putting in the work, and your company has been growing manifold. Your client base is larger than ever, and the projects are pouring in. So what comes next? It is now time to focus on the data you are generating. When programming an application, DevOps engineers keep track of many things, such as bugs, fixes, and overall application performance. This ensures that the application operates with minimal downtime and that future errors can be predicted.

Integrating Observability into Your Security Data Lake Workflows

Today’s enterprise networks are complex. Potential attackers have a wide variety of access points, particularly in cloud-based or multi-cloud environments. Modern threat hunters have the challenge of wading through vast amounts of data in an effort to separate the signal from the noise. That’s where a security data lake can come into play.