
Snowflake: The Platform For The AI Era

The age of AI has arrived, bringing unprecedented opportunities and complex new challenges. The Snowflake AI Data Cloud is the unified platform and connected ecosystem built for this new era of innovation. Our platform empowers organizations to break down silos, activate their data and build powerful AI applications with speed and trust. Join us and discover why Snowflake is the place where data does more.

Break the Boundaries Between Product and UX with Embedded Intelligence

For years, product teams at software companies have faced the same uphill battle: delivering analytics that meet their customers’ expectations while keeping their own roadmaps on track. Too often, the result is static dashboards tacked onto an application—uninspiring, difficult to maintain, and disconnected from user workflows. Meanwhile, customer expectations have evolved: customers want analytics that feel alive, intelligent, and seamlessly part of the products they use every day.

Ep 42 | AI as a New Class of Risk with Ojas Rege

AI is changing the way businesses operate, but without trust, governance, and accountability, progress stalls. Ojas Rege, SVP & GM of Privacy & Data Governance at OneTrust, joins The AI Forecast to explore how organizations can balance innovation with responsibility. He and host Paul Muller unpack why AI represents a “new class of risk,” what it means to design privacy and governance into systems from the very beginning, and how curiosity and context fuel better decision-making with data.

Scaling Kafka Streams Applications: Strategies for High-Volume Traffic

As the adoption of real-time data processing accelerates, the ability to scale stream processing applications to handle high-volume traffic is paramount. Apache Kafka, the de facto standard for distributed event streaming, provides Kafka Streams, a powerful and scalable library for building such applications. Scaling a Kafka Streams application effectively involves a multi-faceted approach that encompasses architectural design, configuration tuning, and diligent monitoring.
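To make the configuration-tuning side concrete, here is a minimal sketch of the Kafka Streams settings most often adjusted for throughput. The application id, broker address, and values shown are illustrative, not recommendations; the right numbers depend on partition counts and workload.

```properties
# Hypothetical streams.properties for a Kafka Streams instance (values illustrative)

# Instances sharing this application.id form one group and split partitions among themselves,
# so horizontal scaling is as simple as starting more instances with the same id.
application.id = orders-aggregator
bootstrap.servers = kafka-broker:9092

# Parallelism within a single instance; useful threads are capped by the
# number of input partitions assigned to this instance.
num.stream.threads = 4

# Warm standby copies of local state stores on other instances,
# so a failover does not have to rebuild state from the changelog.
num.standby.replicas = 1

# Record cache (bytes, shared across threads) that batches state-store
# and downstream writes, trading latency for throughput.
cache.max.bytes.buffering = 10485760
```

Because `num.stream.threads` cannot exceed the useful parallelism set by input partitions, the partition count of source topics is effectively the scaling ceiling and is worth choosing generously up front.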

Cross-Data-Center Apache Kafka Replication: Decision Framework & Readiness Playbook

Building distributed systems is a huge undertaking, but the complexity doesn’t end once your application or platform is “production ready.” Keeping these systems online and operational through cloud region outages, network partitions, and scheduled maintenance is a constant challenge. The bottom line: you don’t want data pipelines for essential business services, customer-facing products, or enterprise data platforms to go dark.
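One common starting point for cross-data-center replication is Kafka’s built-in MirrorMaker 2. As a rough illustration of an active/passive setup, a configuration might look like the sketch below; the cluster names, addresses, and topic pattern are assumptions for the example, not a readiness recommendation.

```properties
# Hypothetical mm2.properties sketch: replicate from a primary cluster to a DR cluster

# Logical cluster aliases and where to reach each one.
clusters = primary, dr
primary.bootstrap.servers = kafka-primary:9092
dr.bootstrap.servers = kafka-dr:9092

# Enable one-way replication (active/passive) from primary to dr,
# mirroring all topics that match the pattern.
primary->dr.enabled = true
primary->dr.topics = .*

# Replication factor for mirrored topics and MirrorMaker's internal topics
# on the target cluster.
replication.factor = 3
checkpoints.topic.replication.factor = 3
heartbeats.topic.replication.factor = 3
```

By default MirrorMaker 2 prefixes mirrored topic names with the source alias (e.g. `primary.orders`), which consumers on the DR cluster need to account for during failover; that naming decision is one of the first items a replication decision framework has to settle.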