Introducing CDE: Purpose-Built Tooling for Accelerating Data Pipelines Demo Highlight
For the full demo, click on this link:
Spark has become the de facto processing framework for ETL and ELT workflows, and for good reason, but for many enterprises working with Spark has been challenging and resource-intensive. By leveraging Kubernetes to fully containerize workloads, CDE provides a built-in administration layer that enables one-click provisioning of autoscaling resources with guardrails, along with a comprehensive job management interface that streamlines pipeline delivery. CDE offers a single pane of glass for managing every aspect of your data pipelines.
In this demo you will learn about:
- Easy deployment of jobs through a simple wizard and a flexible scheduling engine backed by Apache Airflow
- Operationalizing data pipelines from a single pane of glass, from monitoring to self-service troubleshooting and tuning
- On-demand containerized compute that auto-scales to meet business SLAs while using resources efficiently and controlling costs, since you pay only for what you use
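To give a flavor of the Airflow-backed scheduling mentioned above, here is a minimal sketch of an Airflow DAG that chains two pipeline steps on a nightly schedule. This uses the standard Airflow 2.x API only; the DAG name, task names, and shell commands are hypothetical placeholders rather than CDE specifics, and CDE's wizard can generate equivalent definitions for you.

```python
# Minimal Airflow DAG sketch (hypothetical names, standard Airflow 2.x API).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="nightly_etl",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",            # run nightly at 02:00
    catchup=False,
) as dag:
    # Placeholder commands stand in for real Spark job submissions.
    extract = BashOperator(task_id="extract", bash_command="echo extract")
    transform = BashOperator(task_id="transform", bash_command="echo transform")

    # transform runs only after extract succeeds.
    extract >> transform
```

The same DAG file works whether the tasks call shell commands, Spark submits, or other operators; Airflow handles the dependency ordering and the cron-style schedule.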