HBase Clusters Data Synchronization with HashTable/SyncTable tool

Replication (covered in a previous blog article) has been available for a while and is among the most used features of Apache HBase. Running clusters that replicate data to different peers is a very common deployment, whether as a DR strategy or simply as a seamless way of sharing data between production, staging, and development environments.
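For readers curious how the HashTable/SyncTable pair from the title is actually invoked to verify that two replicating clusters hold the same data, both tools run as MapReduce jobs shipped with HBase. A minimal sketch follows; the table name, HDFS paths, and ZooKeeper quorum address are placeholder assumptions, not values from this article:

```shell
# Step 1 (run on the source cluster): compute hashes over the source
# table's data and write them to an HDFS output directory.
hbase org.apache.hadoop.hbase.mapreduce.HashTable \
    --batchsize=32000 my_table /hashes/my_table

# Step 2 (run on the target cluster): compare the target table against
# those hashes and report any divergent ranges. With --dryrun=true the
# job only counts mismatches in its MapReduce counters; dropping the
# flag lets SyncTable write the differing cells to the target table.
hbase org.apache.hadoop.hbase.mapreduce.SyncTable \
    --dryrun=true \
    --sourcezkcluster=zk1.example.com:2181:/hbase \
    hdfs://source-nn:8020/hashes/my_table my_table my_table
```

Because only hashes cross the wire in step 1's output, this is far cheaper than copying full tables when the clusters are mostly in sync.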

Migrating Big Data to the Cloud

Unravel Data helps many customers move big data operations to the cloud. Chris Santiago is Global Director of Solution Engineering here at Unravel. So Unravel, and Chris, know a lot about what can make these migrations fail. Chris and intrepid Unravel Data marketer Quoc Dang recently delivered a webinar, Reasons why your Big Data Cloud Migration Fails and Ways to Overcome. You can view the webinar now, or read on to learn more about how to overcome these failures.

Why Enhanced Visibility Matters for your Databricks Environment

Databricks has become a popular computing framework for big data as organizations increase their investments in moving data applications to the cloud. With that journey comes the promise of better collaboration, processing, and scaling of applications in the cloud. However, customers are finding unexpected costs eating into their cloud budgets, with monitoring/observability tools like Ganglia, Grafana, and the Databricks console telling only part of the story for chargeback/showback reports.

Re-thinking The Insurance Industry In Real-Time To Cope With Pandemic-scale Disruption

The Insurance industry is in uncharted waters and COVID-19 has taken us where no algorithm has gone before. Today’s models, norms, and averages are being re-written on the fly, with insurers forced to cope with the inevitable conflict between old standards and the new normal.

Welcome to data fabric - the architecture of the future

On average, data-driven companies grow more than 30% every year. Because of the competitive advantage that data confers on companies capable of extracting value from it, it has been called the new oil. Companies are tapping into this well of resources because of the advantages it has to offer. But using data to run your operations poses its own set of challenges.

Understanding Snowflake's Resource Optimization Capabilities

The only certainty in today’s world is change. And nowhere is that more apparent than in the way organizations consume data. A typical company might have thousands of analysts and business users accessing dashboards daily, hundreds of data scientists building and training models, and a large team of data engineers designing and running data pipelines. Each of these workloads has distinct compute and storage needs, and those needs can change significantly from hour to hour and day to day.