Analytics

How Developers Can Use Generative AI to Improve Data Quality

It sounds counterintuitive—using a technology that has trust issues to create more trustworthy data. But smart engineers can put generative AI to work to improve the quality of their data, allowing them to build more accurate and trustworthy AI-powered applications.
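
The article goes into the details, but a minimal sketch of the idea is below: ask an LLM to flag likely quality problems in a record before it is loaded. It assumes the OpenAI Python SDK with an API key in the environment; the model name, prompt wording, and example record are illustrative only, and a production version would need batching, retries, and checks on the model's output.

```python
# Minimal sketch: using an LLM to flag likely data-quality problems in a record.
# Assumes the OpenAI Python SDK is installed and OPENAI_API_KEY is set;
# the model name, prompt wording, and sample record are illustrative, not prescriptive.
import json
from openai import OpenAI

client = OpenAI()

def flag_quality_issues(record: dict) -> str:
    """Ask the model to list suspicious fields (bad formats, impossible values, etc.)."""
    prompt = (
        "Review this record for data-quality problems such as malformed emails, "
        "impossible dates, or inconsistent units. List issues as bullet points, "
        "or reply 'OK' if none are found.\n\n" + json.dumps(record)
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any chat-capable model works here
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(flag_quality_issues({"email": "jane@", "signup_date": "2025-02-30", "age": -4}))
```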

Empowering Analytics Teams: Qlik AutoML's Next Evolution

In today's data-driven business landscape, the ability to predict trends, explain drivers, and act on insights is no longer a luxury—it's a necessity. Yet for many organizations, the path from data to predictive intelligence remains challenging. Data science resources are scarce, and traditional tools often require specialized expertise that most analytics teams lack. At Qlik, we believe that the power of predictive analytics should be accessible to all.

Qlik AutoML Series - Predict, Explain, Act - Explainer

Predict, explain, and act with Qlik AutoML, a powerful tool that puts automated machine learning in the hands of business users and data analysts. In this all-new series, revisit and learn how Qlik AutoML lets you build predictive models without needing deep technical expertise in data science. Follow along in the next few videos linked in the description as Mike Tarallo walks you through the key features, from experiment to deployment, and see how explainability is defined and used to gain insights that drive decision-making. Perfect for those looking to enhance their analytics with AI-powered predictions!
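
Qlik AutoML does all of this through its interface rather than code, but for readers who think in code, here is a rough conceptual analogue of the predict-and-explain loop using scikit-learn, with permutation importance standing in for the explainability the product surfaces. The dataset, column names, and model choice are all hypothetical and not part of the series.

```python
# Conceptual analogue only: Qlik AutoML is no-code, but the predict/explain loop
# it automates looks roughly like this. The data file and feature names are made up.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

df = pd.read_csv("customer_churn.csv")                 # hypothetical training data
X, y = df.drop(columns=["churned"]), df["churned"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# "Experiment": train a candidate model (AutoML would try many automatically).
model = RandomForestClassifier(n_estimators=200, random_state=42).fit(X_train, y_train)

# "Predict": score held-out or new records.
predictions = model.predict(X_test)

# "Explain": permutation importance as a simple stand-in for feature-level explainability.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=42)
for name, score in sorted(zip(X.columns, result.importances_mean), key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```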

Connecting to Microsoft Azure SQL Server with Astera Data Stack

In this video, we'll guide you through the process of connecting to Microsoft Azure SQL Server using Astera Data Stack. Users can connect to Azure SQL databases through various objects, including Database Table Source, Database Table Destination, and SQL-related tasks. Contents of the video: an introduction to connecting to Microsoft Azure SQL Server.
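
Astera Data Stack configures the connection through its UI objects, but for context, the equivalent bare-bones connection from code looks roughly like the pyodbc sketch below. The server, database, and credentials are placeholders, and it assumes ODBC Driver 18 for SQL Server is installed.

```python
# Sketch: connecting to an Azure SQL database with pyodbc.
# Server/database/credentials are placeholders; Astera Data Stack holds the
# equivalent settings in its Database Table Source/Destination objects.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:<your-server>.database.windows.net,1433;"
    "Database=<your-database>;"
    "Uid=<your-user>;"
    "Pwd=<your-password>;"
    "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
)

conn = pyodbc.connect(conn_str)
cursor = conn.cursor()
cursor.execute("SELECT TOP 5 name, create_date FROM sys.tables")
for name, created in cursor.fetchall():
    print(name, created)
conn.close()
```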

Shift left to write data once, read as tables or streams

Shift Left is a rethink of how we circulate, share, and manage data in our organizations using DataStreams, Change Data Capture, FlinkSQL, and Tableflow. It addresses the challenges of multi-hop and medallion architectures built on batch pipelines by shifting data preparation, cleaning, and schema definition to the point where data is created. As a result, you can build fresh, trustworthy datasets as streams for operational use cases or as Apache Iceberg tables for analytical use cases.
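
The talk centers on the Confluent stack, but the core shift-left move, applying the schema and cleanup where the data is created rather than in a later pipeline hop, can be sketched in a few lines of producer code. This example assumes the confluent_kafka and jsonschema Python packages; the topic name, schema, and broker settings are placeholders, not the setup described in the talk.

```python
# Sketch of the shift-left idea: validate and clean events against a schema
# at the point of creation, before they ever reach a downstream batch pipeline.
# Topic, schema, and broker settings are placeholders.
import json
from confluent_kafka import Producer
from jsonschema import ValidationError, validate

ORDER_SCHEMA = {
    "type": "object",
    "properties": {
        "order_id": {"type": "string"},
        "amount": {"type": "number", "minimum": 0},
        "currency": {"type": "string", "enum": ["USD", "EUR", "GBP"]},
    },
    "required": ["order_id", "amount", "currency"],
}

producer = Producer({"bootstrap.servers": "<broker>:9092"})  # plus auth settings in practice

def publish_order(event: dict) -> None:
    """Enforce the contract where the data is produced, not in a later cleanup hop."""
    event = {**event, "currency": event.get("currency", "USD").upper()}  # light cleanup
    try:
        validate(instance=event, schema=ORDER_SCHEMA)
    except ValidationError as err:
        raise ValueError(f"Rejected at source: {err.message}") from err
    producer.produce("orders", key=event["order_id"], value=json.dumps(event))

publish_order({"order_id": "o-123", "amount": 42.5, "currency": "usd"})
producer.flush()
```

Records that fail validation never leave the producing service, which is the point: downstream streams and Iceberg tables only ever see data that already matches the contract.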

How to source data from AWS DynamoDB to Confluent using DynamoDB Streams and AWS Lambda

This is a one-minute video showing an animated architectural diagram of the integration between Amazon DynamoDB and Confluent Cloud using DynamoDB Streams and AWS Lambda. Details of the integration are provided via narration.
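
The video stays at the architecture level, but the Lambda piece of that integration can be sketched roughly as below. It assumes a function subscribed to the table's DynamoDB Stream with the confluent_kafka client packaged alongside it; the topic name, environment variable names, and Confluent Cloud credentials are placeholders, and the narrated setup may differ.

```python
# Sketch of the glue code: a Lambda handler that receives DynamoDB Streams
# events and forwards them to a Confluent Cloud topic. Cluster credentials,
# topic name, and environment variable names are placeholders.
import json
import os

from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": os.environ["CONFLUENT_BOOTSTRAP"],
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": os.environ["CONFLUENT_API_KEY"],
    "sasl.password": os.environ["CONFLUENT_API_SECRET"],
})

def handler(event, context):
    """Invoked by the DynamoDB Streams trigger with a batch of change records."""
    for record in event["Records"]:
        keys = record["dynamodb"]["Keys"]
        payload = {
            "event_name": record["eventName"],   # INSERT / MODIFY / REMOVE
            "keys": keys,
            "new_image": record["dynamodb"].get("NewImage"),
        }
        producer.produce(
            "dynamodb.changes",                  # placeholder topic name
            key=json.dumps(keys),
            value=json.dumps(payload),
        )
    producer.flush()
    return {"forwarded": len(event["Records"])}
```

A production function would also handle delivery callbacks and partial batch failures; the sketch keeps only the data path from stream record to Kafka message.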