
Latest Videos

How Piano Helps Companies Use Captured Data to Understand Customer Behavior

Big B2C companies looking to better understand their customers and influence their behavior are turning to Piano’s analytics and activation platform to glean insights from captured data. In this episode of “Powered by Snowflake,” Daniel Myers chats with Piano CEO Trevor Kaufman about how his company’s “digital experience cloud” collects behavioral data from customer website interactions across the digital ecosystem and gives its clients the ability to run predictive analytics and create visualizations and reports that reveal critical information about those customers’ habits, interests, transactions, and more.

Billie Boosts Innovation With Snowflake Data Cloud

Billie is among the fastest-growing financial services startups globally, providing German companies with flexible B2B “buy now, pay later” transactions. In this video, Igor Chtivelband, VP of Data, walks us through Billie’s success story with Snowflake, explaining how migrating Billie’s data warehouse to the Snowflake Data Cloud helps the company better identify potentially fraudulent transactions.

Lightweight Batch Streaming

What’s an easy, low-cost way to batch stream data into Snowflake in near real time? In this episode of Snowflake Bytes, Felipe Hoffa uses RSVP data from Meetup to demonstrate how you can use Google Cloud Pub/Sub and Snowpipe to do just that. Meetup publishes about 4,000 event RSVPs per hour, a stream that can be analyzed to see what’s happening around the world. In the video, you’ll find a link to Hoffa’s blog post, which shares the code he used to create a lightweight pipeline that automatically ingests each of these files into Snowflake so they can be queried.
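For a feel of the moving pieces, here is a minimal Snowflake SQL sketch of an auto-ingest pipe fed from a Google Cloud Storage bucket through a Pub/Sub notification. The object names, bucket, and subscription path are invented for illustration, and this is not necessarily the exact setup from Hoffa’s post; his blog has the real code.

    -- Notification integration tying Snowflake to the Pub/Sub subscription
    -- (all object names, the bucket, and the subscription path are hypothetical)
    CREATE NOTIFICATION INTEGRATION meetup_pubsub_int
      TYPE = QUEUE
      NOTIFICATION_PROVIDER = GCP_PUBSUB
      ENABLED = TRUE
      GCP_PUBSUB_SUBSCRIPTION_NAME = 'projects/my-project/subscriptions/meetup-rsvps-sub';

    -- External stage over the GCS bucket where the RSVP files land
    -- (assumes a storage integration named gcs_int already exists)
    CREATE STAGE meetup_rsvps_stage
      URL = 'gcs://my-meetup-bucket/rsvps/'
      STORAGE_INTEGRATION = gcs_int
      FILE_FORMAT = (TYPE = JSON);

    -- Landing table: one VARIANT column holding each raw RSVP record
    CREATE TABLE meetup_rsvps (rsvp VARIANT);

    -- Pipe that copies each newly notified file into the landing table
    CREATE PIPE meetup_rsvps_pipe
      AUTO_INGEST = TRUE
      INTEGRATION = 'MEETUP_PUBSUB_INT'
    AS
      COPY INTO meetup_rsvps
      FROM @meetup_rsvps_stage;

Once the pipe exists, each notification about a new file triggers a serverless COPY, so fresh RSVPs typically become queryable within a minute or two of landing in the bucket.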

How To Scale Threat Detection & Response With Snowflake & Securonix

Securonix brings exciting new capabilities to the field of threat detection and response thanks to its integration with Snowflake. Together, the two companies provide a split-architecture solution that eliminates data silos and enables organizations to make better, more timely decisions about potential threats. It’s a next-gen SIEM solution already in widespread use at healthcare institutions, airlines, and telecommunications companies.

Elevate Gives Retailers a Powerful New Tool for Managing Supply Chains

In today’s world, retail customers expect things fast. They want their products delivered on time, they don’t want their orders canceled, and when things go wrong, they want answers. To deliver that experience, retailers need to understand, at a granular level, how their customers’ orders are moving through their supply chains. In this episode of “Powered by Snowflake,” Daniel Myers chats with Elevate Co-founder and CTO James Sutton about his company’s recently introduced retail operations platform, which provides the analytics retailers need to evaluate and manage supply chain performance.

Analyzing Unstructured Data With Snowflake Explained In 90 Seconds

What if there was a way to easily manage, process, and analyze any data type in a single platform? Snowflake is here to help. Simplify your architecture with a single platform for all data types and workloads, unlocking new use cases for your data. With Snowpark, your data scientists and engineers can securely build scalable, optimized pipelines, and quickly and efficiently execute machine learning workflows while working in Python, Java, or Scala.
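As a rough illustration of the unstructured-data side (the stage and column choices here are made up, not taken from the video), a stage with a directory table lets you inventory and reference staged files straight from SQL:

    -- Internal stage with a directory table so staged files can be listed from SQL
    CREATE OR REPLACE STAGE docs_stage
      DIRECTORY = (ENABLE = TRUE)
      ENCRYPTION = (TYPE = 'SNOWFLAKE_SSE');

    -- After uploading files (PUT, or syncing from a cloud bucket), refresh the directory
    ALTER STAGE docs_stage REFRESH;

    -- List each file's metadata and build a scoped URL for downstream processing
    SELECT relative_path,
           size,
           last_modified,
           BUILD_SCOPED_FILE_URL(@docs_stage, relative_path) AS scoped_url
    FROM DIRECTORY(@docs_stage);

Those scoped URLs can then be handed to Snowpark code in Python, Java, or Scala that opens each file and extracts whatever the downstream pipeline or model needs.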

SQL Puzzle Optimization: The UDTF Approach For A Decay Function

How do you implement a decay function in SQL? You can use window functions, which scale better than joins, or, better yet, you can try what Felipe Hoffa did: use tabular UDFs. In this video, Felipe shows how a tabular UDF lets you write custom code that analyzes a table row by row while preserving state. He wrote a table UDF in JavaScript that uses very little memory to keep track of the decaying values. It ran in 36 seconds, versus the 46 seconds the window-function SQL took; after optimizing the JavaScript further, he got it down to just 9 seconds.
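Felipe’s video and blog post contain his actual implementation; the sketch below only illustrates the general pattern, with an invented table, column names, and decay factor. A JavaScript tabular UDF keeps its running state in ordinary variables and emits one output row per input row:

    -- A JavaScript tabular UDF that maintains an exponentially decaying running total
    -- across the rows of each partition (names and the 0.9 decay factor are illustrative)
    CREATE OR REPLACE FUNCTION decayed_sum(VAL FLOAT)
    RETURNS TABLE (DECAYED_VALUE FLOAT)
    LANGUAGE JAVASCRIPT
    AS $$
    {
      // Runs once per partition, before any rows arrive: reset the state
      initialize: function (argumentInfo, context) {
        this.total = 0.0;
        this.decay = 0.9;  // assumed decay factor
      },
      // Runs once per row: decay the running total, add the new value, emit the result
      processRow: function (row, rowWriter, context) {
        this.total = this.total * this.decay + row.VAL;
        rowWriter.writeRow({ DECAYED_VALUE: this.total });
      }
    }
    $$;

    -- Usage: partition by entity and order by time so the state decays in sequence
    SELECT t.entity_id, t.event_time, d.decayed_value
    FROM events t,
         TABLE(decayed_sum(t.score) OVER (PARTITION BY t.entity_id ORDER BY t.event_time)) d;

Because the decaying total lives in a plain JavaScript variable, each row is processed once and only a single number is held per partition, in line with the low memory use described above.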