Making Data Ingest and Pipelines Use Case Driven - Webinar with Equalum & GigaOm
Feb 17, 2021

How can an organization ensure it has access to the best data, of the highest quality, with as little delay as possible? Getting there is a matter of policy, business priority, and technical architecture.

Streaming data processing, classic Extract-Transform-Load (ETL), and change data capture (CDC)-based data replication each involve the ingestion, inspection, and movement of data. While typically used in different contexts and on distinct platforms, it's possible to "factor out" the differences, with corresponding benefits. Technology stack precedent notwithstanding, organizations can look at all of these technologies through the wide-angle lens of processing data in real time or in batch.

Viewed this way, the three can work together, in service of real-time data ingest and transformation, DataOps, ELT (Extract-Load-Transform), and classic ETL scenarios. Picking one of these approaches becomes a question of design rather than a choice of platform, skillset, or vendor. As requirements evolve, changing the ingest approach becomes a supported modification, avoiding re-platforming and developing pipelines from scratch.
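The idea of separating transformation logic from the ingest approach can be sketched conceptually. This is a minimal illustration, not Equalum's API: the function names and record fields are hypothetical, and the point is only that the same transformation step can serve both a bounded batch and an unbounded stream.

```python
# Conceptual sketch (hypothetical names, not Equalum's API): the same
# transformation logic reused across batch and streaming ingestion,
# making the ingest approach a design choice rather than a re-platforming.
from typing import Iterable, Iterator


def transform(record: dict) -> dict:
    """One transformation step, independent of how the data arrives."""
    return {**record, "amount_usd": round(record["amount"] * record["fx_rate"], 2)}


def run_batch(records: list) -> list:
    # Classic ETL: process a complete, bounded dataset at once.
    return [transform(r) for r in records]


def run_streaming(records: Iterable) -> Iterator:
    # Streaming / CDC: process records one at a time as they arrive.
    for record in records:
        yield transform(record)


sample = [{"amount": 10.0, "fx_rate": 1.1}]
batch_out = run_batch(sample)
stream_out = list(run_streaming(iter(sample)))
assert batch_out == stream_out  # identical results, different ingest modes
```

Switching a pipeline from batch to streaming then means swapping the runner, not rewriting the transformation itself.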

Want to adopt this outlook that puts data and insights at the center of attention, with technology as a facilitator? The webinar features GigaOm analyst Andrew Brust and special guest Nir Livneh, CEO and Founder of Equalum, a specialist in streaming and CDC-powered modern data integration. Topics include:


  • The correlation of business data strategies and ingest approaches
  • The challenges and strengths of streaming, CDC, and batch data processing
  • How to standardize on open source technologies without new skillset burdens