
Data Chief Live: External data: Your secret weapon in a cookie-less world

How do you get to know your customer in a cookie-less world? Join Rosemary Hua, Global Head of Retail & CPG GTM at Snowflake and Forbes 30 Under 30; Erik Mitchell, founder and principal at Seek Data; Nik Lampropoulos, Global Director of Data, Insights & Analytics at Hogarth Worldwide; and Cindi Howson, ThoughtSpot CDSO, as they discuss that question and more.

Talend acquires Gamma Soft

On April 7, 2022, Talend, a global leader in data integration and management, announced that it has acquired Gamma Soft, a market innovator in change data capture (CDC). The addition of Gamma Soft's highly complementary, enterprise-class change data capture technologies will help customers streamline their data modernization initiatives, including cloud migrations, and support advanced, real-time analytics use cases across hybrid and multi-cloud environments.

The World Beyond Test Automation: AI-Powered Intelligent Testing for Modern Applications

Web and mobile apps are now your primary connection with your customers, and staying relevant and winning market share requires making constant changes to these apps. But how can you deploy many more small changes - often several per day - with confidence and managed risk? In the company of two software industry leaders, we take a closer look at how a modern testing toolchain combines production safety nets - from canaries to feature flags to error reporting - with AI-powered quality insights to engineer quality at speed for both developers and quality engineers.
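To make one of those safety nets concrete, here is a minimal, hypothetical sketch of a percentage-based feature flag - the names and the bucketing scheme are illustrative assumptions, not any particular vendor's implementation. The idea is to deterministically route a slice of users to a new code path so a risky change can be rolled out (and rolled back) gradually.

```python
import hashlib

def flag_enabled(flag_name: str, user_id: str, rollout_percent: int) -> bool:
    """Deterministically bucket a user into [0, 100) and compare to the rollout.

    Hashing flag_name and user_id together keeps each user's bucket stable
    per flag, so raising rollout_percent only ever adds users to the cohort.
    """
    digest = hashlib.sha256(f"{flag_name}:{user_id}".encode()).digest()
    bucket = int.from_bytes(digest[:4], "big") % 100
    return bucket < rollout_percent

# Usage: gate a new code path behind a 25% rollout.
if flag_enabled("new-checkout-flow", "user-42", 25):
    pass  # new code path
else:
    pass  # stable fallback
```

Real feature-flag products layer targeting rules, kill switches, and audit logs on top of this basic bucketing idea.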

SQL Puzzle Optimization: The UDTF Approach For A Decay Function

How do you implement a decay function in SQL? You can use window functions, which scale better than joins, or better yet, you can try what Felipe Hoffa did: use tabular UDFs. In this video, Felipe shows how a tabular UDF lets you write custom code that analyzes a table row by row while preserving state. He wrote a table UDF in JavaScript that uses a small amount of memory to keep track of the decaying values. It ran in 36 seconds, instead of the 46 seconds the window-function SQL solution took; he then optimized the JavaScript further and got it down to just 9 seconds.
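The core trick - carrying a small piece of state from row to row instead of re-scanning a window - can be sketched in a few lines. This is an illustrative Python analogue of what such a stateful UDTF computes, not Felipe's actual JavaScript; the function name and the 0.9 decay rate are assumptions.

```python
def decayed_values(values, rate=0.9):
    """Yield (value, decayed_total) per row, carrying state forward.

    Before each new value is added, the running total is multiplied by the
    decay rate, so older contributions fade geometrically. Only one number
    of state is kept, regardless of how many rows have been seen.
    """
    total = 0.0
    for value in values:
        total = total * rate + value  # decay accumulated state, then add
        yield value, total

# Usage: a constant input of 1.0 converges toward 1 / (1 - rate) = 10.
for value, total in decayed_values([1.0, 1.0, 1.0, 1.0]):
    print(round(total, 4))  # 1.0, 1.9, 2.71, 3.439
```

The constant-memory state is what lets the UDTF approach beat a window function here: each row is touched once, with no per-row window to re-aggregate.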

Data Warehouse Automation: What, Why, and How?

Building a data warehouse is an expensive affair, and it often takes months to build one from scratch. There is also a constant struggle to keep up with the large volumes of data that are constantly generated. On top of that, setting up a strong architectural foundation, working through repetitive and mundane data validation tasks, and ensuring data accuracy are further challenges. This puts tremendous stress on data teams and data warehouses. Data warehouse automation is intended to handle this growing complexity.

Hybrid Data Delivery "Cloud Sources" Walkthrough

We have expanded our Hybrid Data Delivery service to load analytics-ready data from a number of cloud-based data sources directly to Snowflake - without the need for Qlik Replicate. This initial update allows you to connect to more than 20 cloud-based data sources, such as Amazon Redshift, Google BigQuery, and Salesforce, and land the data directly in Snowflake as a target on a scheduled basis, so it can be used with your analytics applications - offering a single solution for on-prem and cloud data movement and replication.

AstraZeneca: Building a finance data hub

At AstraZeneca, supporting functions like Finance are intensely data-driven. Recently, the data and IT team completely overhauled their data architecture to better serve the needs of the Finance team: they decided to build a Finance data hub. In this video, key project stakeholders explain why and how they built the data hub for the Finance team (using Talend and AWS), and they detail how it's integrated with other data hubs at AstraZeneca.

Building Product Analytics At Petabyte Scale

Product analytics is one of the most critical and complex tasks for any product team. Thousands of data points have to be analyzed carefully when setting up the product analytics foundation, which enables product teams to use data to track, visualize, and analyze user engagement and behavior, and to apply those insights to improve and optimize the product experience. However, managing large data workloads can be very challenging, as not all of the data that is collected can be used directly for analytics.