
Hands-on Session: Unlock AI-Powered Data Engineering on Snowflake

Your data team doesn’t need more tools. It needs fewer bottlenecks. What if you could go from raw data to production-ready pipelines and AI workflows in a single day? With Snowflake’s Cortex Code, teams can now build, optimize, and deploy data workflows using natural language, dramatically accelerating development inside the warehouse.

Matt Forrest's Journey into Geospatial: Building Data Systems the Right Way

Introducing Episode 1 of Data Builder Club: a series to celebrate the data leaders behind the most impactful data systems. In this episode we sit down with Matt Forrest, Director of Customer Engineering at Wherobots, geospatial advocate, and LinkedIn's go-to voice for modern data and spatial engineering. Matt opens up about his unconventional path into data, his philosophy around building reliable geospatial systems, and why a good foundation is the only thing that makes everything else possible.

Hevo's Next Evolution

Every company has an AI roadmap. Very few have the data infrastructure to execute it. At Hevo Data, we've spent 8 years building pipelines that are reliable, simple, and transparent so 2,000+ data teams can build without second-guessing their data. We sat down with Manish Jethani, Amit Gupta, and Scott Husband to talk about what comes next. If your data isn't AI-ready, your roadmap stays a roadmap. We've re-engineered the platform to serve as the context engine your AI vision actually runs on. Because the models are only as good as the data underneath them.

Mastering Data Ingestion with Apache Airflow: How to Build Reliable Pipelines

Apache Airflow has become the standard orchestrator for data pipelines, applications, and AI systems. But orchestration alone does not solve one of the biggest operational challenges: reliable data ingestion. In this live session, we explore how integrating Hevo directly into Airflow workflows creates a reliable foundation for modern ELT pipelines. Through native operators, sensors, and triggers, teams can orchestrate ingestion, monitor pipeline health, and ensure downstream analytics and AI workloads always run on trusted data.
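The sensor pattern described above, waiting for an ingestion run to finish before releasing downstream tasks, can be sketched in plain Python. This is an illustration of the idea, not Hevo's or Airflow's actual API: the `get_status` callable and the status strings are assumptions.

```python
import time

def wait_for_pipeline(get_status, timeout_s=600, poke_interval_s=5, sleep=time.sleep):
    """Block until an ingestion run reports success, in the style of an
    Airflow sensor: poll, sleep, poll again, and fail fast on errors."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()        # e.g. a call to the pipeline's status endpoint
        if status == "SUCCESS":
            return True              # safe to release downstream analytics tasks
        if status == "FAILED":
            raise RuntimeError("ingestion failed; halting downstream tasks")
        sleep(poke_interval_s)       # back off before the next poke
    raise TimeoutError("ingestion did not finish within the timeout")
```

In a real DAG, an Airflow sensor's `poke` method plays this role; the point is that downstream models only run once ingestion has verifiably succeeded.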

Demo Days: Reliability Under Pressure: How to Build Self-Recovering Data Pipelines

Modern data pipelines don’t fail loudly. A schema change slips through. A few bad records halt ingestion. Dashboards go stale. Engineers rerun backfills. Warehouse costs spike. Business teams begin to question the data. Pipeline instability and silent failures remain some of the biggest bottlenecks for analytics teams operating at scale.
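One way to make a pipeline self-recovering, sketched here in plain Python as an illustration rather than Hevo's actual mechanism, is to quarantine bad records into a dead-letter store instead of letting them halt the whole batch:

```python
def ingest_batch(records, validate, load, dead_letter):
    """Load every record that passes validation; quarantine the rest.

    A few bad records are routed to `dead_letter` for later replay instead
    of halting ingestion and letting dashboards go stale."""
    loaded, quarantined = 0, 0
    for record in records:
        try:
            validate(record)             # e.g. schema / type checks
        except ValueError:
            dead_letter.append(record)   # quarantine; keep the batch moving
            quarantined += 1
            continue
        load(record)
        loaded += 1
    return loaded, quarantined
```

The dead-letter queue turns a silent, batch-halting failure into a visible, replayable one: the good records land on time, and the quarantined ones can be fixed and re-ingested.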

Build agentic AI in minutes on Snowflake

Agentic AI doesn’t have to mean months of architecture work, custom orchestration layers, or external platforms. In this hands-on workshop, you’ll build Snowflake Intelligence agents using native Snowflake capabilities to reason over structured data, retrieve context from unstructured sources, and execute multi-step analysis directly inside Snowflake within minutes.

Hevo Demo Days: Live Workshop: Build a Production-Ready Pipeline in 10 Minutes

Data drives every strategic move today, from real-time customer experiences to AI that powers business outcomes. But the infrastructure behind that data is often anything but modern. Pipelines are fragile, visibility is limited, and every schema change or API hiccup turns into engineering firefighting. Hours get lost to manual fixes and maintenance instead of innovation. When reliable data depends on constant human attention, the pace of the business suffers.

Maximize Snowflake ROI with Hevo & Astrato

This 45-minute session dives into how Hevo, Snowflake, and Astrato come together to give data engineers a faster, cleaner, and far more scalable workflow. With Snowflake Data Superhero Piers Batchelor and Martin Mahler, CEO & Founder of Astrato Analytics, leading the discussion, you will see how to cut pipeline friction, reduce refresh delays, and deliver near real-time insights without constantly tuning or fighting your stack.

Achieve Complete Observability Over Your Data Pipelines

Data teams today manage hundreds of moving parts, from sources and syncs to transformations and warehouses, yet often find out something’s broken only after dashboards go blank or reports turn unreliable. When every second of data downtime affects critical decisions, visibility isn’t optional; it’s essential.