
Introducing Snowflake Notebooks, an End-to-End Interactive Environment for Data & AI Teams

We’re pleased to announce the launch of Snowflake Notebooks in public preview, a highly anticipated addition to the Snowflake platform tailored specifically to integrate the best of Snowflake within a familiar notebook interface. Snowflake Notebooks aim to provide a convenient, easy-to-use interactive environment that seamlessly blends Python, SQL and Markdown, as well as integrations with key Snowflake offerings, like Snowpark ML, Streamlit, Cortex and Iceberg tables.

Episode 10: Exploring data innovation in healthcare | Merilytic Health

Erin Rebholz, former McKesson executive and President of Merilytic Health, discusses the role of cloud-based solutions, data governance and interoperability in improving clinical outcomes and patient care. Erin emphasizes the importance of effective data sharing and the potential impact of AI and machine learning on healthcare, offering insights into evolving data analytics practices that can empower healthcare providers.

The Award Winning Formula: How Cloudera Empowered OCBC With Trusted Data To Unlock Business Value from AI

Recently, Cloudera and OCBC were named winners in the “Best Big Data and Analytics Infrastructure Implementation” category at The Asian Banker’s Financial Technology Innovation Awards 2024. This recognition underscores the importance of trusted data when building AI and generative AI (GenAI) models, and serves as a testament to the impact that reliable data can have in real-world use cases.

How to Analyze Data from a REST API with Flink SQL

Join Lucia Cerchie in a coding walkthrough that bridges the gap between REST APIs and data streaming. Together we’ll transform the OpenSky Network's live API into a data stream using Kafka and Flink SQL. Along the way, we not only turn the REST API into a data stream but also clean up the data: we use Flink SQL to make it more readable, keeping more of the business logic out of the client code.
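As a rough illustration of the cleanup step described above (written in Python rather than Flink SQL, and with field positions assumed from the OpenSky `/api/states/all` response format), the idea is to turn a raw positional state vector into an event with readable, named fields:

```python
# Sketch of the "clean up" step: mapping a raw OpenSky state vector
# (a positional list) to a trimmed event dict with named fields.
# Field positions here are assumed from the OpenSky /api/states/all format.

def clean_state(state):
    """Map a raw state vector to a readable event dict."""
    return {
        "icao24": state[0],
        "callsign": (state[1] or "").strip(),  # callsigns are space-padded
        "origin_country": state[2],
        "longitude": state[5],
        "latitude": state[6],
        "velocity": state[9],
    }

# Example raw state vector (shape only; the values are made up):
raw = ["abc123", "SWR193  ", "Switzerland", None, 1700000000,
       8.55, 47.45, 10668.0, False, 230.5]

event = clean_state(raw)
```

In the walkthrough itself this renaming and trimming is expressed declaratively in Flink SQL, which is what keeps the logic out of the client code.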

What is Streaming ETL?

Streaming ETL is a modern approach to extracting, transforming, and loading (ETL) that processes and moves data from source to destination in real-time. It relies on real-time data pipelines that process events as they occur. Events refer to various individual pieces of information within the data stream. Depending on the source and purpose of the data, an event could be a single user visit to a website, a new post on a social media platform, or a data point from a temperature sensor.
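The per-event nature of streaming ETL can be sketched in a few lines of Python. This is a minimal illustration, not a real streaming framework: the sensor events and the list used as a sink are made up for the example.

```python
# Minimal sketch of a streaming ETL pipeline: each event is extracted,
# transformed, and loaded as it arrives, rather than in periodic batches.

def transform(event):
    """Convert a raw temperature reading from Celsius to Fahrenheit."""
    return {"sensor": event["sensor"],
            "temp_f": event["temp_c"] * 9 / 5 + 32}

def run_pipeline(source, sink):
    for event in source:               # extract: consume one event at a time
        sink.append(transform(event))  # transform + load, per event

# Simulated stream of temperature-sensor events
stream = iter([
    {"sensor": "s1", "temp_c": 20.0},
    {"sensor": "s2", "temp_c": 25.0},
])

sink = []
run_pipeline(stream, sink)
```

The key contrast with batch ETL is that the loop body runs as each event arrives, so results land in the destination with per-event latency instead of waiting for a scheduled batch.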

Databricks vs. Snowflake: A Comparative Analysis

The continuously evolving data management landscape has given rise to powerful platforms like Databricks and Snowflake, each offering distinct capabilities that help organizations manage and analyze their data efficiently. In this article, we will dive into a comprehensive comparison of Databricks and Snowflake, examining the two companies’ features, performance, scalability, and more.