
June 2019

Data Pipelines and the Promise of Data

The flow of data can be perilous. Any number of problems can develop as data moves from one system to another: flows can hit bottlenecks that introduce latency, data can become corrupted, and datasets may conflict or contain duplicates. The more complex the environment and the more intricate the requirements, the greater the potential for such problems; higher volume increases it further. Transporting data between systems often requires several steps.
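Two of the problems named above, duplication and corruption, can be guarded against at each pipeline step. The sketch below is illustrative only (the record shape, field names, and checksum scheme are assumptions, not anything from a specific product): it drops repeated deliveries of the same record and flags records whose payload no longer matches the checksum computed at the source.

```python
import hashlib

def dedupe_and_verify(records):
    """Drop duplicate records and separate out corrupted ones.

    Each record is assumed to be a dict with hypothetical fields
    'id', 'payload', and 'checksum'. A record counts as corrupted
    when the SHA-256 of its payload no longer matches the checksum
    attached when it left the source system.
    """
    seen_ids = set()
    clean, corrupted = [], []
    for rec in records:
        if rec["id"] in seen_ids:
            continue  # duplicate delivery: keep only the first copy
        seen_ids.add(rec["id"])
        digest = hashlib.sha256(rec["payload"].encode()).hexdigest()
        if digest != rec["checksum"]:
            corrupted.append(rec)  # e.g. route to a dead-letter store
        else:
            clean.append(rec)
    return clean, corrupted
```

In a real pipeline this kind of check would typically run at every hop, so that a corrupted or duplicated record is caught at the step where it appeared rather than after several downstream transformations.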

Meeting SLAs for Data Pipelines on Amazon EMR With Unravel

A household name in global media analytics – let’s call them MTI – is using Unravel to support their data operations (DataOps) on Amazon EMR, establishing and protecting their internal service level agreements (SLAs) and getting the most out of their Spark applications and pipelines. Amazon EMR was an easy choice for MTI as the platform to run all their analytics. To start with, getting up and running is simple: there is nothing to install and no configuration is required.

Making Data Work With the Unravel Partner Program

Modern data apps are increasingly moving to the cloud because of their elastic compute demands, skills shortages, and the complexity of managing big data on premises. And while more and more organizations are taking their data apps to the cloud to leverage its flexibility, they are also finding it very challenging to assess application needs and to migrate and optimize their data without compromising performance and cost targets.

Unraveling the Complex Streaming Data Pipelines of Cybersecurity

Earlier this year, Unravel released the results of a survey that looked at how organizations are using modern data apps and at general trends in big data. There were many interesting findings, but I was most struck by what the survey revealed about security. First, respondents indicated that they get the most value from big data when leveraging it for security applications: fraud detection was listed as the single most effective use case for big data, while cybersecurity intelligence was third.