
Building an automated data pipeline from BigQuery to Earth Engine with Cloud Functions

Over the years, vast amounts of satellite data have been collected, and ever more granular data are being collected every day. Until recently, those data have been an untapped asset in the commercial space, largely because neither the tools required for large-scale analysis of this type of data nor the satellite imagery itself was readily available. Thanks to Earth Engine, a planetary-scale platform for Earth science data & analysis, that is no longer the case.
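The post's full pipeline is not reproduced here, but one step any BigQuery-to-Earth-Engine pipeline needs is converting tabular query results into a geospatial format the Earth Engine client can ingest. Below is a minimal, illustrative sketch of that conversion; the column names (`lon`, `lat`, `station_id`) are hypothetical, and rows are modeled as plain dicts rather than BigQuery row objects.

```python
def rows_to_geojson(rows):
    """Turn point rows (dicts with 'lon'/'lat' keys) into a
    GeoJSON FeatureCollection dict, which could later be wrapped
    in an ee.FeatureCollection by the Earth Engine client library."""
    features = []
    for row in rows:
        features.append({
            "type": "Feature",
            "geometry": {
                "type": "Point",
                # GeoJSON coordinate order is [longitude, latitude].
                "coordinates": [row["lon"], row["lat"]],
            },
            # Everything that is not geometry becomes a property.
            "properties": {k: v for k, v in row.items()
                           if k not in ("lon", "lat")},
        })
    return {"type": "FeatureCollection", "features": features}

if __name__ == "__main__":
    sample = [{"station_id": "a1", "lon": -122.08, "lat": 37.42}]
    fc = rows_to_geojson(sample)
    print(fc["features"][0]["geometry"]["coordinates"])  # [-122.08, 37.42]
```

In a Cloud Functions deployment, a function like this would sit between the BigQuery query step and the Earth Engine upload step; the scheduling and authentication pieces are omitted.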

Analyzing satellite images in Google Earth Engine with BigQuery SQL

Google Earth Engine (GEE) is a groundbreaking product that has been available for research and government use for more than a decade. Google Cloud recently launched GEE to General Availability for commercial use. This blog post describes a method of utilizing GEE from within BigQuery SQL, allowing SQL users to access, and derive value from, the vast troves of data available within Earth Engine.
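One documented mechanism for calling external services from BigQuery SQL is a remote function backed by a Cloud Function: BigQuery POSTs a JSON body whose `calls` key holds one argument list per row, and expects a `replies` list of the same length back. The sketch below shows that contract only; the Earth Engine lookup itself is stubbed out with a placeholder string, and whether this is the exact approach the post takes is an assumption.

```python
import json

def handle_remote_function(request_body: str) -> str:
    """Sketch of a Cloud Function body for a BigQuery remote function.

    BigQuery sends {"calls": [[arg1, arg2], ...]} and expects
    {"replies": [...]} with one reply per call, in order.
    """
    calls = json.loads(request_body)["calls"]
    replies = []
    for lon, lat in calls:
        # Placeholder: a real function would query Earth Engine here,
        # e.g. sampling an image at (lon, lat). We return a dummy value.
        replies.append(f"value at ({lon}, {lat})")
    return json.dumps({"replies": replies})

if __name__ == "__main__":
    body = json.dumps({"calls": [[-122.08, 37.42]]})
    print(handle_remote_function(body))
```

On the BigQuery side, such an endpoint is registered with `CREATE FUNCTION ... REMOTE WITH CONNECTION`, after which it can be invoked row-by-row from ordinary SQL.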

How to simplify and fast-track your data warehouse migrations using BigQuery Migration Service

Migrating data to the cloud can be a daunting task. Moving data from warehouses and legacy environments in particular requires a systematic approach: these migrations usually demand manual effort and can be error-prone. They are complex and involve several steps, such as planning, system setup, query translation, schema analysis, data movement, validation, and performance optimization.

Scaling Kafka Brokers in Cloudera Data Hub

This blog post provides guidance for administrators who use, or are interested in using, Kafka in Cloudera Data Hub and need to scale clusters up or down to balance performance and cloud costs in production deployments. Grouping Kafka brokers into host groups makes it easier for administrators to add and remove nodes, creating the flexibility to handle fluctuating real-time data feed volumes.
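When brokers are added or removed, partition replicas must be redistributed across the new broker set; in practice this is done by Kafka's reassignment tooling (or Cruise Control in Cloudera clusters). The following is a deliberately simplified, illustrative sketch of the kind of round-robin plan such tools compute; real tools also balance leaders, rack placement, and the volume of data moved, none of which is modeled here.

```python
def reassign_partitions(partitions, brokers, replication_factor=2):
    """Map each partition to `replication_factor` brokers, round-robin.

    Assumes replication_factor <= len(brokers), as Kafka requires
    distinct brokers for the replicas of a partition.
    """
    plan = {}
    n = len(brokers)
    for i, partition in enumerate(partitions):
        replicas = [brokers[(i + r) % n] for r in range(replication_factor)]
        plan[partition] = replicas
    return plan

if __name__ == "__main__":
    # After scaling a cluster from 3 brokers to 4, recompute assignments:
    plan = reassign_partitions(["topic-0", "topic-1", "topic-2"],
                               brokers=[1, 2, 3, 4])
    print(plan)
```

The output maps each partition to an ordered replica list, analogous to the JSON plan consumed by `kafka-reassign-partitions.sh --execute`.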

Editing and saving a dashboard

In this video you will learn how to edit one of your existing Yellowfin dashboards, such as adding a new report, and then save those edits by publishing the dashboard. You will also learn how to change the dashboard's title, select the folder where the dashboard will be saved, add tags to your dashboard, and set the Dashboard Access to either Public or Private.

Enterprise data and analytics in the cloud with Microsoft Azure and Talend

The emergence of the cloud as a cost-effective solution for delivering compute power has caused a paradigm shift in how we approach designing, building, and delivering analytics to business users. Although forklifting an existing analytics environment into the cloud is possible, there is substantial benefit for those who are willing to review and adjust their systems to capitalize on the strengths of the cloud.

Yellowfin Named Embedded Business Intelligence Software Leader in G2 Fall Reports 2022

Yellowfin has again been recognized in the Leader quadrant in the 2022 G2 Fall Grid Reports for Embedded Business Intelligence (Enterprise and Small Business). This is Yellowfin's 13th quarter in a row to be named a leader in a G2 Grid Report. The Yellowfin team are grateful to our customers for the reviews they have provided for our embedded analytics capability and product suite on G2, a leading business software and service comparison source for trusted user ratings and peer-to-peer reviews.

Webinar: Unlocking the Value of Cloud Data and Analytics

From data lakes and data warehouses to data mesh and data fabric architectures, the world of analytics continues to evolve to meet the demand for fast, easy, wide-ranging data insights. Right now, nearly 50% of DBTA subscribers are using public cloud services, and many are investing further in staff, skills, and solutions to address key technical challenges. Even today, the amount of time and resources most organizations spend analyzing data pales in comparison to the effort expended in identifying, cleansing, rationalizing, consolidating, and transforming that data.