
Data Ingestion vs. ETL: Understanding the Difference

Working with large volumes of data requires effective data management practices and tools, and two of the most frequently used processes are data ingestion and ETL. Given the similarities between the two, non-technical people often want to understand what sets them apart, turning to search queries like “data ingestion vs ETL”.

Getting Started with Insurance Modernization

If you have a legacy system with customized capabilities and valuable features that is nearing end of life, refactoring the system is a potential choice for modernization. Insurance platform modernization usually involves one-to-one code migration, which is often more costly and time-consuming than expected. It also typically misses some of the integration and data architecture modernization that is foundational to getting the full value of digitization.

Accessibility Testing: Where do we start?

My first attempt to understand accessibility and how to test it dates back to 2016. I was working for a company where accessibility was already an important part of the process, and I was asked to perform accessibility testing on the product to determine whether we could claim to be accessible. Since the subject was new to me, I was quite confused. Where should I start? Should I be certified to provide a proper assessment? Did I need to request assessments from third-party companies?

Tabular Reporting for NPrinting Users - Do More with Qlik

Join Michael Tarallo, along with special guests Product Manager Andrew Kruger and Principal Platform Architect Johnny Poole, for part 2 of Qlik Tabular Reporting. In this session, the focus shifts to the migration path for NPrinting users, exploring available utilities, migration options, and best practices for a seamless transition.

Special Episode: Fivetran and Databricks CEOs reveal the secret to AI

George Fraser, CEO and co-founder of Fivetran, and Ali Ghodsi, CEO and co-founder of Databricks, are building products that power the modern data stack. They offer an insider’s perspective on the hardest parts of building and deploying generative AI in the enterprise.