Ep 59: New Zealand's Crown Research Institute CDAO, Jan Sheppard on Treating Data as a Treasure

Treating data as a treasure is a foundational principle for Jan Sheppard, the Chief Data and Analytics Officer at New Zealand's Crown Research Institute of Environmental Science and Research (ESR). The agency leads ongoing research in public health, environmental health, and forensics for New Zealand. Like many CDAO roles, hers is relatively new, but the values she applies to data can be traced back many hundreds of years to the indigenous Māori people of her country. Through her work, Jan recognizes the profound impact data can have on people and their environments for generations to come.

What Challenges Are Hindering the Success of Your Data Lake Initiative?

Conventional databases are no longer the appropriate solution in a world where data volume is growing every second. Many modern businesses are adopting big data technologies such as data lakes to cope with growing data volume and velocity. Data lake infrastructures such as Apache Hadoop are designed to handle data at large scale. These infrastructures offer benefits such as data replication for enhanced protection and multi-node computing for faster data processing.
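To see why replication offers enhanced protection, consider a simple back-of-the-envelope calculation: if each replica of a data block is lost independently with probability p, keeping r copies drops the loss probability to p^r. The failure probabilities below are illustrative assumptions, not measurements from any real cluster:

```python
# Hypothetical illustration: probability of losing a data block when
# each of r independently stored replicas fails with probability p.
def block_loss_probability(p: float, r: int) -> float:
    return p ** r

print(block_loss_probability(0.01, 1))  # single copy: 0.01
print(block_loss_probability(0.01, 3))  # three-way replication, as HDFS does by default
```

With three-way replication (the HDFS default), a 1% per-replica failure chance becomes roughly a one-in-a-million chance of losing the block, which is why data lakes trade extra storage for durability.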

7 Best Data Pipeline Tools 2022

The data pipeline is at the heart of your company's operations. It allows you to take control of your raw data and use it to generate revenue-driving insights. However, managing all the different types of data pipeline operations (data extraction, transformation, loading into databases, orchestration, monitoring, and more) can be daunting. Here, we present the 7 best data pipeline tools of 2022, with pros, cons, and who they are most suitable for. 1. Keboola 2. Stitch 3. Segment 4.
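The extract, transform, and load steps mentioned above can be sketched in a few lines of standard-library Python. The CSV data, filter threshold, and table name here are illustrative assumptions, not taken from any of the tools listed:

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV (here an inline string standing in for a source file).
raw = "name,amount\nalice,10\nbob,25\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: cast amounts to integers and keep only amounts of 20 or more.
cleaned = [(r["name"], int(r["amount"])) for r in rows if int(r["amount"]) >= 20]

# Load: insert the transformed rows into a database table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (name TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
print(conn.execute("SELECT name, amount FROM sales").fetchall())  # [('bob', 25)]
```

Dedicated pipeline tools add what this sketch lacks: orchestration (scheduling and retries), monitoring, and connectors for many sources and destinations.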

Introduction to Automated Data Analytics (With Examples)

Is repetitive and menial work impeding your data scientists, analysts, and engineers from delivering their best work? Consider automating your data analytics to free their hands from routine tasks so they can dedicate their time to more meaningful, creative work that requires human attention. In this blog, we'll walk through an introduction to automated data analytics, with examples. Let's dive in.
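As a minimal sketch of the kind of routine task worth automating, consider a daily metrics summary an analyst might otherwise recompute by hand. The record structure and field names below are hypothetical:

```python
from statistics import mean

# Hypothetical automated report: summarize a day's metric values once,
# then schedule this function instead of recomputing the numbers manually.
def daily_summary(records):
    values = [r["value"] for r in records]
    return {"count": len(values), "mean": mean(values), "max": max(values)}

data = [{"value": 10}, {"value": 20}, {"value": 30}]
print(daily_summary(data))
```

Wired to a scheduler and a data source, a function like this turns a recurring manual chore into a report that simply appears, freeing the team for analysis that actually requires judgment.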

Yellowfin Named Embedded Business Intelligence Software Leader in G2 Fall Reports 2022

Yellowfin has again been recognized in the Leader quadrant in the 2022 G2 Fall Grid Reports for Embedded Business Intelligence (Enterprise and Small Business). This is Yellowfin's 13th consecutive quarter being named a Leader in a G2 Grid Report. The Yellowfin team is grateful to our customers for the reviews they have provided for our embedded analytics capability and product suite on G2, a leading business software and service comparison source for trusted user ratings and peer-to-peer reviews.

Webinar: Unlocking the Value of Cloud Data and Analytics

From data lakes and data warehouses to data mesh and data fabric architectures, the world of analytics continues to evolve to meet the demand for fast, easy, wide-ranging data insights. Right now, nearly 50% of DBTA subscribers are using public cloud services, and many are investing further in staff, skills, and solutions to address key technical challenges. Even today, the amount of time and resources most organizations spend analyzing data pales in comparison to the effort expended in identifying, cleansing, rationalizing, consolidating, and transforming that data.

Talend's contributions to Apache Beam

Apache Beam is an open-source, unified programming model for batch and streaming data processing pipelines that simplifies large-scale data processing. The Apache Beam model offers powerful abstractions that insulate you from the low-level details of distributed data processing, such as coordinating individual workers, reading from sources, and writing to sinks.
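The core idea of that abstraction can be sketched in plain Python: a pipeline is a chain of transforms applied to elements flowing from a source to a sink, and the same chain works whether the source is a finite batch or an unbounded stream. This is a stdlib-only analogy of the model, not Beam's actual API:

```python
# Stdlib-only sketch of the pipeline idea: the same transform chain
# applies to any iterable source, batch or streaming alike.
def run_pipeline(source, transforms, sink):
    for element in source:
        for fn in transforms:
            element = fn(element)
        sink.append(element)

# Here the "source" is a small in-memory batch; a generator yielding
# live events could be swapped in without changing the transforms.
batch = ["hello world", "beam model"]
out = []
run_pipeline(batch, [str.split, len], out)
print(out)  # [2, 2]
```

In Beam proper, the runner (not your code) decides how to parallelize these transforms across workers, which is exactly the low-level coordination the model insulates you from.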