
Latest Videos

Cost-Effective, High-Performance Move to Cloud

The move to cloud may be the biggest challenge, and opportunity, facing IT departments today. In this 45-minute webinar, Unravel Data product marketer Floyd Smith and Solutions Engineering Director Chris Santiago describe how to move workloads to the cloud quickly and cost-effectively, while maintaining high performance for the newly cloud-based workloads. Tune in to find out the best way to de-risk your cloud migration projects with data-driven insights.

Cost Optimization on Microsoft Azure

Do you use big data and streaming services such as Azure HDInsight, Databricks, and Kafka/Event Hubs? Do you have on-premises big data that you want to move to Azure? Keeping costs down in Microsoft Azure is difficult, but vital. Join Chris Santiago of Unravel Data and explore how to reduce, manage, and allocate streaming data and big data costs in Azure.

Why Enhanced Visibility Matters for your Databricks Environment

Databricks has become a popular computing framework for big data as organizations increase their investment in moving data applications to the cloud. With that journey comes the promise of better collaboration, processing, and scaling of applications in the cloud. However, customers are finding unexpected costs eating into their cloud budgets, because monitoring and observability tools like Ganglia, Grafana, and the Databricks console tell only part of the story for chargeback/showback reports.
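As one illustration of how teams often close that gap, here is a minimal sketch (not the approach covered in the webinar) that sets custom cost-allocation tags when creating a Databricks cluster through the Clusters REST API, so chargeback/showback reports can attribute spend to a team or project. The workspace URL, token, node type, runtime version, and tag names are all placeholder assumptions.

```python
import requests

# Placeholder workspace URL and personal access token (assumptions).
WORKSPACE_URL = "https://<your-workspace>.azuredatabricks.net"
TOKEN = "<personal-access-token>"

# Cluster spec with custom tags. Databricks propagates custom_tags to the
# underlying cloud resources, which is what makes per-team or per-project
# chargeback/showback reporting possible.
cluster_spec = {
    "cluster_name": "etl-nightly",           # hypothetical cluster name
    "spark_version": "11.3.x-scala2.12",     # example runtime version
    "node_type_id": "Standard_DS3_v2",       # example Azure node type
    "num_workers": 4,
    "custom_tags": {
        "team": "data-engineering",          # hypothetical tag keys/values
        "cost_center": "cc-1234",
        "project": "nightly-etl",
    },
}

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=cluster_spec,
    timeout=30,
)
resp.raise_for_status()
print("Created cluster:", resp.json().get("cluster_id"))
```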

Reasons why your Big Data Cloud Migration Fails and Ways to Overcome Them

The cloud brings many opportunities to help implement big data across your enterprise, and organizations are taking advantage by migrating big data workloads to the cloud using best-of-breed technologies like Databricks, Cloudera, Amazon EMR, and Azure HDInsight, to name a few. However, as powerful as these technologies are, most organizations that attempt to use them fail. Join Chris Santiago, Director of Solution Engineering, as he shares the top reasons why big data cloud migrations fail and ways to overcome them.

5 Ways to Slash your Data Platform Costs

Make your data platform faster, better, and cheaper with Unravel. Join Chris Santiago, Director of Solution Engineering, to learn how to reduce the time spent troubleshooting and the costs involved in operating your data platform. Instantly understand why Spark applications, Kafka jobs, and Impala queries underperform or even fail! Define and meet enterprise service levels through proactive reporting and alerting.
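To make that kind of underperformance concrete, here is a minimal sketch that inspects two Spark settings which frequently explain slow or failing applications: the shuffle partition count and executor memory. The application name and the adjusted value are illustrative assumptions, not recommendations from the webinar.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("perf-check").getOrCreate()

# spark.sql.shuffle.partitions defaults to 200, which is often far too high
# for small datasets and far too low for very large joins.
print("shuffle partitions:", spark.conf.get("spark.sql.shuffle.partitions"))

# Executor memory that is too small leads to disk spills and OOM failures.
print("executor memory:",
      spark.sparkContext.getConf().get("spark.executor.memory", "not set"))

# Illustrative adjustment only; the right value depends on data volume
# and cluster size.
spark.conf.set("spark.sql.shuffle.partitions", "64")
```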

Migrating Big Data Workloads to the Cloud with Unravel

The movement to utilize data to drive more effective business outcomes continues to accelerate. But with this acceleration comes an explosion of complex platforms to collect, process, store, and analyze this data. Ensuring these platforms are utilized optimally is a tremendous challenge for businesses. Join Mick Nolen, Senior Solutions Engineer at Unravel Data, as he takes you through Unravel’s approach to migrating big data workloads to the cloud. Whether you’re migrating from…

CDO Sessions: Getting Real with Data Analytics

Big data leaders are no doubt being challenged by market uncertainty. Data-driven insights can help organizations assess and uncover market risks and opportunities that may arise during uncertain times. As businesses around the world adapt to digitization initiatives, modern data systems have become more mission-critical to continuity and competitive differentiation.

Amazon EMR Insider Series: Optimizing big data costs with Amazon EMR & Unravel

Data is a core part of every business. As data volumes increase, so do the costs of processing them. Whether you are running your Apache Spark, Hive, or Presto workloads on-premises or on AWS, Amazon EMR is a sure way to save you money. In this session, we’ll discuss several best practices and new features that enable you to cut your operating costs when processing vast amounts of data using Amazon EMR.
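As one concrete example of a cost-saving EMR feature, here is a minimal sketch that enables EMR managed scaling on an existing cluster with boto3, so the cluster grows and shrinks with demand instead of running at peak size around the clock. The region, cluster ID, and capacity limits are placeholder assumptions, not recommendations from the session.

```python
import boto3

# Placeholder region and cluster ID (assumptions).
emr = boto3.client("emr", region_name="us-east-1")
CLUSTER_ID = "j-XXXXXXXXXXXXX"

# Managed scaling lets EMR resize the cluster between the limits below,
# so you are not paying for idle capacity during off-peak hours.
emr.put_managed_scaling_policy(
    ClusterId=CLUSTER_ID,
    ManagedScalingPolicy={
        "ComputeLimits": {
            "UnitType": "Instances",            # or "VCPU" / "InstanceFleetUnits"
            "MinimumCapacityUnits": 2,          # illustrative floor
            "MaximumCapacityUnits": 10,         # illustrative ceiling
            "MaximumOnDemandCapacityUnits": 4,  # keep remaining capacity on Spot
        }
    },
)
print("Managed scaling enabled for", CLUSTER_ID)
```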