
Why Data Collaboration Projects Fail - and How Yours Can Succeed with a Data Clean Room

As privacy standards continue to evolve, businesses face a dual challenge: to uphold ethical standards for data use while seizing the opportunities offered by data collaboration. Enter data clean rooms: a privacy-enhancing solution that allows organizations to share valuable insights without compromising compliance. If you're new to data clean rooms, our recent blog post “Data Clean Rooms Explained: What You Need to Know About Privacy-First Collaboration” breaks down the fundamentals.

Delivering The Right Message To The Right Person At The Right Time With Help From the AI Data Cloud

2degrees is a full-service telco, infrastructure owner, and energy retailer connecting people and businesses all around New Zealand. The combined business has approximately 1,600 employees who serve 2 million-plus customers.

Agentic AI in Financial Services and Insurance

Many financial services companies are experimenting with AI through pilot programs, but several challenges remain for adoption. Key concerns include data security, the accuracy of large language models (LLMs), and the rigorous scrutiny from regulators regarding AI’s role in financial decision-making. Current use cases are largely internal, with some customer-facing chatbot solutions addressing noncritical service inquiries.

Building Cost-Effective, Real-Time Pipelines with Snowflake Dynamic Tables

Join Sales Engineer Gabriel Mullen as he demonstrates how Snowflake’s Dynamic Tables streamline real-time data pipelines. Discover a simple, declarative approach to ingest data incrementally, maintain cost efficiency, and keep insights fresh. This demo will walk through setting target lag, leveraging incremental refreshes, and automating orchestration, allowing you to power analytics and BI dashboards with minimal overhead and maximum performance.
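The workflow described above, declaring a target lag and letting Snowflake handle incremental refreshes, can be sketched in a few lines of SQL. The table, warehouse, and column names below are hypothetical placeholders, not taken from the demo:

```sql
-- A dynamic table that keeps a per-region sales rollup fresh.
-- Snowflake refreshes it automatically, incrementally where possible,
-- aiming to stay within the declared TARGET_LAG of the source data.
CREATE OR REPLACE DYNAMIC TABLE sales_summary
  TARGET_LAG = '5 minutes'        -- how stale the results may get
  WAREHOUSE = analytics_wh        -- compute used for refreshes
  REFRESH_MODE = INCREMENTAL      -- process only changed rows when supported
  AS
    SELECT region,
           SUM(amount)  AS total_sales,
           COUNT(*)     AS order_count
    FROM raw_sales
    GROUP BY region;
```

With this in place, downstream BI dashboards simply query `sales_summary`; there is no separate orchestration job to schedule or monitor.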

Scale Unstructured Text Analytics with Efficient Batch LLM Inference

Unstructured text is everywhere in business: customer reviews, support tickets, call transcripts, documents. Large language models (LLMs) are transforming how we extract value from this data by running tasks from categorization to summarization and more. While AI has proved that real-time, natural-language conversations with LLMs are possible, the bigger opportunity is applying those same models to extract insights from millions of unstructured records at once. This is where batch LLM inference becomes essential.
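In Snowflake, batch LLM inference can be as simple as calling a Cortex function inside an ordinary query, so the model runs over every row of a table rather than one prompt at a time. A rough sketch, with a hypothetical `support_tickets` table and an illustrative model choice:

```sql
-- Batch-classify every support ticket in one set-based query.
-- Table name, column names, and model are assumptions for illustration.
SELECT ticket_id,
       SNOWFLAKE.CORTEX.COMPLETE(
         'llama3.1-8b',
         'Categorize this support ticket as billing, technical, or other. ' ||
         'Reply with only the category. Ticket: ' || ticket_text
       ) AS category
FROM support_tickets;
```

Because the inference runs inside the query engine, the results can be joined, aggregated, or written to another table like any other SQL output.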

Data Quality Monitoring: Enabling Reliable, High-Integrity Data

In this demo, we’ll show you how to create a custom Data Metric Function (DMF), associate it with your tables for continuous data quality monitoring, and query the results from a centralized table. Watch to learn how built-in monitoring helps you track critical data objects, identify quality issues, and take quick action to ensure reliable, high-integrity data across your organization.
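The steps the demo walks through, defining a DMF, scheduling it on a table, and querying the centralized results, might look roughly like this (schema, table, and column names are hypothetical):

```sql
-- 1. Define a custom data metric function that counts NULLs in a column.
CREATE OR REPLACE DATA METRIC FUNCTION governance.dmfs.null_email_count(
    arg_t TABLE(arg_c VARCHAR))
RETURNS NUMBER
AS
'SELECT COUNT(*) FROM arg_t WHERE arg_c IS NULL';

-- 2. Schedule measurements on the monitored table, then attach the DMF.
ALTER TABLE crm.public.customers
  SET DATA_METRIC_SCHEDULE = '60 MINUTE';

ALTER TABLE crm.public.customers
  ADD DATA METRIC FUNCTION governance.dmfs.null_email_count ON (email);

-- 3. Review results from the centralized monitoring view.
SELECT *
FROM SNOWFLAKE.LOCAL.DATA_QUALITY_MONITORING_RESULTS
ORDER BY measurement_time DESC;
```

From there, alerts or dashboards can be built on the results view to flag tables whose metrics drift out of tolerance.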

How Leaders in Financial Services and Manufacturing Accelerate Business Outcomes with Data and AI

Some 70% of organizations are actively exploring or implementing large language model (LLM) use cases, but fewer than a third of generative AI experiments have made it into production. A common hurdle? The inability to access and leverage the data crucial for running AI applications effectively. Snowflake’s Accelerate 2025 virtual events dive into the challenges and myriad opportunities offered by AI.

Spark NZ Sets Secure, Governed Data Foundations for the Era of AI

Over the past few years, Spark New Zealand has tackled the challenge of creating a strong data foundation by moving all of its data warehouses into Snowflake to create a centralised data platform. Now, explains Pritha Chattopadhyay, Domain Chapter Lead at Spark, this telecommunications leader and digital services provider is diving into artificial intelligence with the help of Snowflake Cortex AI. Tune in to learn about the benefits it provides.