
SaaS In 60 - New filter, scheduler interval, write back and more!

This week we’ve added:

- a new customizable filter object,
- a new interval in the scheduler for alerts and reloads,
- support for Parquet data files, and
- the ability to execute an app automation workflow from a button object in Qlik Cloud Analytics, with variable value passing that enables a number of advanced automated workflow use cases, including write back.

Creating a basic write back solution with Qlik Cloud

Using Qlik Cloud Analytics and Qlik Application Automation, you can create sophisticated solutions to many business problems. With Qlik's new properties in the action button object, you can now execute an Application Automation workflow while passing parameter/value pairs to the workflow. Check out this simple walk-through for an example of writing data back to a Microsoft Azure SQL database.
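The database side of such a write-back flow is a parameterized insert driven by the parameter/value pairs the button passes to the automation. Here's a minimal Python sketch of that step, using the standard library's sqlite3 as a stand-in for the Azure SQL connection (in Qlik the automation's database connector block performs this insert; the table, column, and parameter names below are hypothetical):

```python
import sqlite3

# Hypothetical parameter/value pairs, as a button object might pass them to
# the automation. These names are illustrative, not Qlik's actual payload schema.
params = {"customer_id": "C-1001", "status": "approved", "comment": "Reviewed by analyst"}

# sqlite3 stands in here for the Azure SQL connection an automation would use.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS writeback (customer_id TEXT, status TEXT, comment TEXT)"
)

# Parameterized insert: never interpolate user-supplied values into SQL directly.
conn.execute(
    "INSERT INTO writeback (customer_id, status, comment) VALUES (?, ?, ?)",
    (params["customer_id"], params["status"], params["comment"]),
)
conn.commit()

rows = conn.execute("SELECT customer_id, status, comment FROM writeback").fetchall()
```

The key design point carries over regardless of database: the values arriving from the app are treated as bind parameters, not concatenated into the SQL string.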

Using Dead Letter Queues with SQL Stream Builder

Cloudera SQL Stream Builder gives non-technical users the power of a unified stream processing engine, so they can integrate, aggregate, query, and analyze both streaming and batch data sources in a single SQL interface. This allows business users to define events of interest that they need to monitor continuously and respond to quickly. A dead letter queue (DLQ) can be used when deserialization errors occur as events are consumed from a Kafka topic.
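The DLQ pattern itself is simple to illustrate: records that fail deserialization are routed to a separate topic instead of failing the job, so the good records keep flowing and the bad ones remain available for inspection or replay. A minimal Python sketch of that routing logic (the payloads and in-memory "topics" are illustrative; SQL Stream Builder configures this declaratively rather than in consumer code):

```python
import json

# Simulated raw Kafka records; the third payload is malformed JSON and will
# fail deserialization.
records = [b'{"id": 1, "temp": 21.5}', b'{"id": 2, "temp": 19.0}', b'{"id": 3, "temp":']

processed, dead_letter = [], []

for raw in records:
    try:
        processed.append(json.loads(raw))
    except json.JSONDecodeError as err:
        # Route the bad record, along with the error, to the dead letter
        # queue instead of crashing the pipeline.
        dead_letter.append({"payload": raw, "error": str(err)})
```

After the loop, `processed` holds the two valid events and `dead_letter` holds the malformed one with its error message attached.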

Building a Data-Centric Platform for Generative AI and LLMs at Snowflake

Generative AI and large language models (LLMs) are revolutionizing many aspects of both developer and non-coder productivity by automating repetitive tasks and quickly generating insights from large amounts of data. Snowflake users are already taking advantage of LLMs to build really cool apps, integrating with web-hosted LLM APIs via external functions and using Streamlit as an interactive front end for LLM-powered apps such as AI plagiarism detection, AI assistants, and MathGPT.

Discovering Data Monetization Opportunities in Financial Services

Data has become an essential driver of new monetization initiatives in the financial services industry. The vast amounts of data collected from customers, transactions, market movements, and other sources offer tremendous potential for financial institutions to extract valuable insights that can inform business decisions, improve customer service, and create new revenue streams.