

Observability in Snowflake: A New Era with Snowflake Trail

Traditionally, discovering and surfacing telemetry can be a tedious and challenging process, especially when it comes to pinpointing specific issues for debugging. However, as applications and pipelines grow in complexity, understanding what’s happening beneath the surface becomes increasingly crucial. A lack of visibility hinders the development and maintenance of high-quality applications and pipelines, ultimately impacting customer experience.

Introducing Snowflake Notebooks, an End-to-End Interactive Environment for Data & AI Teams

We’re pleased to announce the launch of Snowflake Notebooks in public preview, a highly anticipated addition to the Snowflake platform tailored specifically to integrate the best of Snowflake within a familiar notebook interface. Snowflake Notebooks aim to provide a convenient, easy-to-use interactive environment that seamlessly blends Python, SQL and Markdown, as well as integrations with key Snowflake offerings, like Snowpark ML, Streamlit, Cortex and Iceberg tables.

Celebrating Innovation and Excellence: Announcing Snowflake's Data Drivers

Snowflake announced the global winners of the sixth annual Data Drivers Awards, the premier data awards that honor Snowflake customers who are leading their organizations and transforming their industries with the AI Data Cloud. This year’s winners of the Data Drivers Awards include data leaders from across global organizations, including Caterpillar, Bentley, Mitsubishi Corporation, Zoom and more.

Snowflake Massively Expands Types of Applications That Can Be Built, Deployed and Distributed on Snowflake

Apps are the way to democratize AI: to make it accessible to everyone and streamline customers’ experiences with faster time to insights. According to a recent IDC survey, AI applications are currently the largest category of AI software, accounting for roughly half of the market’s overall revenue in 2023.

Simplified End-to-End Development for Production-Ready Data Pipelines, Applications, and ML Models

In today’s world, innovation doesn’t happen in a vacuum; collaboration can help technological breakthroughs happen faster. The rise of AI, for example, will depend on the collaboration between data and development. We’re increasingly seeing software engineering workloads that are deeply intertwined with a strong data foundation.

Introducing Polaris Catalog: An Open Source Catalog for Apache Iceberg

Open source file and table formats have garnered much interest in the data industry because of their potential for interoperability — unlocking the ability for many technologies to safely operate over a single copy of data. Greater interoperability not only reduces the complexity and costs associated with using many tools and processing engines in parallel, but also reduces the potential risks associated with vendor lock-in.

Retail Media's Business Case for Data Clean Rooms Part 2: Commercial Models

In Part 1 of “Retail Media’s Business Case for Data Clean Rooms,” we discussed how to (1) assess your data assets and (2) define your data structures and permissions. Once you have a plan on paper, you can begin sizing the data clean room opportunity for your business.