
Break Data Silos: Build, Deploy and Serve Models at Scale with Snowflake ML

Despite the best efforts of many ML teams, most models still never make it to production. Disparate tooling often leads to fragmented data and ML pipelines, along with complex infrastructure management. Snowflake has continuously focused on making it easier and faster for customers to bring advanced models into production.

Scale Your Python Analytics With Pandas on Snowflake

Massive data sets can overwhelm native Pandas, causing memory issues and slow performance. Pandas on Snowflake eliminates these constraints by running Python code directly in Snowflake, with no rewrites needed. This demo shows how to transform and visualize large data sets using the familiar Pandas API with Snowflake’s distributed compute. Boost your data workflows and maintain security and governance, all while staying within the Pandas ecosystem.
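Because pandas on Snowflake keeps the standard pandas API, an existing workflow needs no rewriting to run on Snowflake's distributed compute. The sketch below uses local pandas purely to illustrate the kind of filter/group/aggregate transformation that would carry over unchanged; the `orders` data and column names are made up for the example.

```python
# Minimal sketch, assuming a hypothetical orders data set. With pandas on
# Snowflake, `pd` would instead be the Snowpark pandas module and the
# DataFrame would be backed by a Snowflake table, but the code below is
# plain pandas and runs locally.
import pandas as pd

orders = pd.DataFrame({
    "region": ["EMEA", "APAC", "EMEA", "AMER"],
    "amount": [120.0, 75.5, 60.0, 200.0],
})

# Familiar pandas operations: filter rows, group, aggregate, sort.
large = orders[orders["amount"] >= 75.0]
by_region = large.groupby("region")["amount"].sum().sort_values(ascending=False)
print(by_region.to_dict())  # → {'AMER': 200.0, 'EMEA': 120.0, 'APAC': 75.5}
```

The point of the demo is that none of these lines change when the DataFrame is a Snowflake-backed one; only the import and the data source differ.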

How Retail and Media Leaders Drive Customer Satisfaction and Profits with Data and AI

Nearly nine out of 10 business leaders say their organizations’ data ecosystems are ready to build and deploy AI, according to a recent survey. But 84% of the IT practitioners surveyed spend at least one hour a day fixing data problems: 70% spend one to four hours a day remediating data issues, while 14% spend more than four hours each day.

Providing Better Customer Experiences With the Help of Cortex AI

Headquartered in Sydney, Domain Group is a leading Australian property marketplace. Its mission is to inspire confidence in life’s property decisions, and its property marketplace tools reach an average audience of 7 million Australians every month. A long-time Snowflake customer, Domain has started working with Cortex Analyst to enable its staff to query its large data sets using natural language.

Strengthen Your Cloud Security With Snowflake’s Trust Center

Snowflake’s Trust Center, now in general availability, delivers a unified way to identify, address, and monitor security risks. Discover how scanner packages (Security Essentials, CIS Benchmark, and more) streamline compliance checks, reduce costs, and simplify account security across multiple clouds. See how to configure scans, resolve vulnerabilities, and enforce best practices for user authentication, network policies, and more. Take a proactive approach to safeguarding sensitive data and learn how the Trust Center supports Snowflake’s shared security model.

Why Data Collaboration Projects Fail - and How Yours Can Succeed with a Data Clean Room

As privacy standards continue to evolve, businesses face a dual challenge: upholding ethical standards for data use while seizing the opportunities offered by data collaboration. Enter data clean rooms: a privacy-enhancing solution that allows organizations to share valuable insights without compromising compliance. If you're new to data clean rooms, our recent blog post “Data Clean Rooms Explained: What You Need to Know About Privacy-First Collaboration” breaks down the fundamentals.

Delivering the Right Message to the Right Person at the Right Time With Help From the AI Data Cloud

2degrees is a full-service telco, infrastructure owner, and energy retailer connecting people and businesses all around New Zealand. The combined business has approximately 1,600 employees who serve 2 million-plus customers.

Agentic AI in Financial Services and Insurance

Many financial services companies are experimenting with AI through pilot programs, but several challenges remain for adoption. Key concerns include data security, the accuracy of large language models (LLMs), and the rigorous scrutiny from regulators regarding AI’s role in financial decision-making. Current use cases are largely internal, with some customer-facing chatbot solutions addressing noncritical service inquiries.

Building Cost-Effective, Real-Time Pipelines with Snowflake Dynamic Tables

Join Sales Engineer Gabriel Mullen as he demonstrates how Snowflake’s Dynamic Tables streamline real-time data pipelines. Discover a simple, declarative approach to ingest data incrementally, maintain cost efficiency, and keep insights fresh. This demo will walk through setting target lag, leveraging incremental refreshes, and automating orchestration, allowing you to power analytics and BI dashboards with minimal overhead and maximum performance.
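The declarative approach in the demo boils down to a single CREATE DYNAMIC TABLE statement whose TARGET_LAG clause controls how fresh the incremental refreshes keep the result. As a sketch, the helper below assembles that DDL; the table, warehouse, and source query are hypothetical, and in practice the statement would be executed through a Snowflake session rather than printed.

```python
# Hedged sketch: build the CREATE DYNAMIC TABLE DDL described in the demo.
# Names (daily_revenue, transform_wh, raw_orders) are made-up examples.
def dynamic_table_ddl(name: str, warehouse: str, target_lag: str, query: str) -> str:
    """Return DDL for a dynamic table; TARGET_LAG sets the freshness goal."""
    return (
        f"CREATE OR REPLACE DYNAMIC TABLE {name}\n"
        f"  TARGET_LAG = '{target_lag}'\n"
        f"  WAREHOUSE = {warehouse}\n"
        f"AS\n"
        f"{query}"
    )

ddl = dynamic_table_ddl(
    name="daily_revenue",
    warehouse="transform_wh",
    target_lag="1 minute",
    query="SELECT order_date, SUM(amount) AS revenue FROM raw_orders GROUP BY order_date",
)
print(ddl)
```

Once the statement runs, Snowflake handles the orchestration: it refreshes the table incrementally as `raw_orders` changes, aiming to stay within the one-minute target lag, with no separate scheduling code to maintain.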

Scale Unstructured Text Analytics with Efficient Batch LLM Inference

Unstructured text is everywhere in business: customer reviews, support tickets, call transcripts, documents. Large language models (LLMs) are transforming how we extract value from this data, handling tasks from categorization to summarization and more. While LLMs have proved that real-time natural-language conversations are possible, applying them to extract insights from millions of unstructured records can be a game changer. This is where batch LLM inference becomes essential.
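The core pattern of batch inference is simple: instead of issuing one model call per record, records are grouped into batches and sent through the inference function together. The sketch below illustrates the shape of such a pipeline; `classify_batch` is a stand-in for a real LLM call (for example, a Snowflake Cortex function), stubbed with a keyword rule so the example runs on its own.

```python
# Illustrative sketch of batch LLM inference over support tickets.
# `classify_batch` is a hypothetical stub; a real pipeline would call an
# LLM endpoint here and likely run batches concurrently.
from typing import Iterator


def batched(records: list[str], size: int) -> Iterator[list[str]]:
    """Yield successive fixed-size batches of records."""
    for i in range(0, len(records), size):
        yield records[i : i + size]


def classify_batch(batch: list[str]) -> list[str]:
    """Stubbed categorization standing in for an LLM call."""
    return ["complaint" if "refund" in t.lower() else "other" for t in batch]


tickets = [
    "I want a refund for my last order",
    "How do I reset my password?",
    "Refund still not processed after two weeks",
    "Love the new dashboard!",
]

labels = [
    label
    for batch in batched(tickets, size=2)
    for label in classify_batch(batch)
]
print(labels)  # → ['complaint', 'other', 'complaint', 'other']
```

Batching amortizes per-call overhead (connection setup, prompt scaffolding, scheduling) across many records, which is what makes running LLM tasks over millions of rows practical.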