
Developing a Basic Web Application using an Operational DB on CDP

In this video, you'll see a simple demo of how you can build a web application on top of a Cloudera Operational Database. We'll leverage the Apache Phoenix integration to easily write SQL statements against our database and use the Python Flask library to power the back-end API calls. The web application will be hosted within Cloudera Machine Learning, showcasing some of the benefits of having your data within a hybrid data platform.
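The pattern described above can be sketched as a small Flask app serving query results as JSON. This is a minimal sketch, not the demo's actual code: the Phoenix Query Server URL is a placeholder, and since Phoenix's Python driver (phoenixdb) follows the standard DB-API, an in-memory SQLite database with a hypothetical `readings` table stands in for the Operational Database here so the sketch is self-contained.

```python
# Minimal sketch of a Flask back end over a DB-API connection.
# Assumption: phoenixdb in production; sqlite3 stands in here, since
# both expose the same Python DB-API cursor interface.
import sqlite3
from flask import Flask, jsonify

app = Flask(__name__)

def get_connection():
    # Production would look roughly like (endpoint is hypothetical):
    #   import phoenixdb
    #   return phoenixdb.connect("http://phoenix-query-server:8765/", autocommit=True)
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE readings (sensor TEXT, value REAL)")
    conn.executemany("INSERT INTO readings VALUES (?, ?)",
                     [("s1", 21.5), ("s2", 19.0)])
    return conn

@app.route("/readings")
def readings():
    # Plain SQL against the database, returned as JSON to the front end.
    cur = get_connection().cursor()
    cur.execute("SELECT sensor, value FROM readings ORDER BY sensor")
    rows = [{"sensor": s, "value": v} for s, v in cur.fetchall()]
    return jsonify(rows)
```

Because the route only depends on a DB-API cursor, swapping SQLite for phoenixdb is a one-function change, which is what makes this style of integration convenient.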

Processing DICOM Files With Spark on CDP Hybrid Cloud

In this video, you will see how you can use PySpark to process medical images from an MRI and convert them from DICOM format to PNG. The data is read from and written to AWS S3, and we leverage the NumPy and pydicom libraries to do the data transformation. We are using data from the "RSNA-MICCAI Brain Tumor Radiogenomic Classification" Kaggle competition, but this approach can be used for general-purpose DICOM processing.
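The core transformation step can be sketched as follows. This is a sketch under assumptions, not the video's code: in the real pipeline the pixel data would come from pydicom's `pixel_array` inside a PySpark job reading S3 objects; here a synthetic 16-bit array stands in so the example is self-contained, and Pillow does the PNG encoding.

```python
# Sketch of the DICOM -> PNG conversion step: rescale a raw (often
# 12/16-bit) pixel array to 8-bit grayscale and encode it as PNG bytes.
import io
import numpy as np
from PIL import Image

def dicom_pixels_to_png_bytes(pixels: np.ndarray) -> bytes:
    lo, hi = float(pixels.min()), float(pixels.max())
    scale = 255.0 / (hi - lo) if hi > lo else 0.0
    img8 = ((pixels.astype(np.float32) - lo) * scale).astype(np.uint8)
    buf = io.BytesIO()
    Image.fromarray(img8).save(buf, format="PNG")
    return buf.getvalue()

# In the real pipeline (hypothetical wiring):
#   ds = pydicom.dcmread(io.BytesIO(s3_object_bytes))
#   png = dicom_pixels_to_png_bytes(ds.pixel_array)
# and the PNG bytes would be written back to S3.
synthetic = np.random.default_rng(0).integers(0, 4096, size=(64, 64),
                                              dtype=np.uint16)
png_bytes = dicom_pixels_to_png_bytes(synthetic)
```

Wrapping the per-image work in a pure function like this also makes it easy to hand to Spark, e.g. as the body of a `mapPartitions` over a list of S3 keys.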

Future of Data Meetup: CDP on Azure - Industrial Strength Data Engineering

Data Engineering is undergoing a huge evolution, requiring faster and more reliable data pipelines. Apache Spark and Python are core foundational components of this new architecture, enabling data engineers to quickly develop these pipelines, but they also introduce challenges when moving to production. Come join us to ask questions and learn. We will also have a raffle of Cloudera swag.

What's New in CDP Public Cloud? Hive and Impala Get a Facelift

Join us LIVE to discuss what’s new in CDP Public Cloud! Don’t miss the live Q&A as we learn about the new capabilities in Cloudera Data Warehouse. See how the Impala and Hive engines get a facelift, and watch a demo of how you can run advanced analytics at scale in a few easy steps.

Streaming Analytics with SQL Stream Builder

SQL Stream Builder, part of Cloudera Streaming Analytics, allows developers, analysts, and data scientists to write streaming applications using industry-standard SQL. It provides an interactive experience, so the development process is quick, easy, and productive while taking advantage of Apache Flink’s streaming power. It also includes an advanced materialized view engine to interface with applications, tooling, and services via REST API.
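To give a flavor of the "industry-standard SQL" mentioned above, a streaming job in this style might look like the following Flink SQL fragment. This is illustrative only: the `clickstream` table, its columns, and the one-minute window are hypothetical, not taken from the video.

```sql
-- Hypothetical example: count clicks per user in one-minute tumbling windows
-- over a streaming `clickstream` source (legacy Flink SQL windowing syntax).
SELECT
  userid,
  COUNT(*) AS clicks
FROM clickstream
GROUP BY
  TUMBLE(event_time, INTERVAL '1' MINUTE),
  userid;
```

A query like this could back a materialized view, which downstream applications would then read over the REST API rather than consuming the raw stream.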