
Latest Videos

Marketing and advertising data extraction made easy

Watch this video for a brief explanation of how to get all your ad campaign analytics data out of your ad platforms and see it all in one place without any coding, API extraction, or manual data compilation. Stitch partners with the most common ad platforms to help move your data from sources like Google Ads, LinkedIn Ads, Facebook Ads (including Instagram), TikTok for Business, Snapchat Ads, AdRoll, Microsoft Bing Ads, and more into any data warehouse or data lake.

Maximize the value potential of your data with data excellence

Reaching the full value potential of your data requires visibility into your challenges and an incremental path to improvement. Many organizations overlook the foundational changes in both technology and culture needed to mature toward data excellence. Join Darren Brunt to explore the journey, the benefits, and the potential challenges of establishing data excellence.

Setting up Google BigQuery as a data warehouse in minutes

In this tutorial, learn how to set up a new Google BigQuery cloud-based data warehouse account and extract data from all your data sources using Stitch in less than 3 minutes. Stitch partners with the most common data warehouses and data lakes to help move your data from sources like Shopify, MongoDB, LinkedIn Ads, Zapier, HubSpot, SendGrid, Google Analytics, and more. Watch this step-by-step tutorial on how to set up Google BigQuery for data storage.
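Once Stitch has loaded data into your BigQuery warehouse, you can confirm the rows arrived with a quick query. The sketch below is a minimal example using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical placeholders for whatever a Stitch integration creates in your account.

    # Minimal sketch: verify rows loaded into BigQuery after a Stitch sync.
    # Assumes the google-cloud-bigquery client library and Application
    # Default Credentials; the dataset/table names below are illustrative.
    from google.cloud import bigquery

    client = bigquery.Client()

    query = """
        SELECT COUNT(*) AS row_count
        FROM `my-project.stitch_google_analytics.report`
    """

    for row in client.query(query).result():
        print(f"Rows loaded: {row.row_count}")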

How to practice responsible AI - Scott Zoldi

This episode features an interview with Scott Zoldi. He is the Chief Analytics Officer at FICO, where he is responsible for the analytic development of FICO’s product and technology solutions. Scott is involved in developing new analytic products and applications, and has authored more than 100 patents. His current focus is on self-learning analytics to detect cybersecurity attacks. On this episode, Scott talks about how to attract and retain world-class data scientists, the importance of following a model governance process, and responsible AI.

Being a Steward of Data and Insights - Robert Brown

This episode features an interview with Robert Brown, the Senior Director of Research for the Venture Forward Initiative at GoDaddy. This is his 13th year at GoDaddy, having started as Director of Database Marketing. Prior to GoDaddy, Robert served as a director at Pulte Homes for 9 years. On this episode, Robert talks about tiering data for smarter decisioning, developing intrinsic motivation in employees, and being a successful steward of data and insights.

Turning data into dollars - Philip O'Donnell

This episode features an interview with Philip O’Donnell, Group SVP of Data Platforms at the Adecco Group, the world’s leading talent advisory and solutions company. Philip has 13 years of experience in data analytics leadership and strategy consulting across a variety of industries. Prior to the Adecco Group, Philip served as Director of Data Science at Lee Hecht Harrison. On this episode, Philip discusses managing data at a big enterprise, how to prevent business decisions based on bad data, and turning data into dollars.

When business gets weird, the tough get healthy data - Talend CEO keynote at Talend Connect '21

Change is something every business leader has to deal with. But have you noticed that doing business has just gotten weird? An event that occurs around the world can suddenly have a profound effect on your company. How do you deal with constant, discombobulating change?

How to improve risk management and resiliency - Kelly Hereid

This episode features an interview with Dr. Kelly Hereid, Director of Catastrophe R&D at Liberty Mutual Insurance in the Corporate Enterprise Risk Management Group. Prior to Liberty Mutual, Kelly was a research scientist at Chubb in their primary-side natural catastrophe unit. She has a Ph.D. in geological sciences from the University of Texas at Austin, focusing on climate science. On this episode, Kelly talks about using historical data to create catastrophe models, taking a strategic standpoint to invest in resiliency, and reducing vulnerability to changing hazards.