
How to start a data literacy program in 6 steps

In a world where 2.5 quintillion bytes of data are created every day, it’s not surprising that organizations want to harness the power of being data-driven. In our 2022 Data Health Barometer, 99% of companies surveyed recognized that data is crucial for success — but 97% said they face challenges in using data effectively. Perhaps in response to those challenges, 65% of companies reported that they'd started a data literacy program.

Product announcement: Keboola is launching no-code transformations!

In this exciting new development, Keboola is launching no-code data transformations for everybody on the platform. No-code transformations let users without deep technical know-how build robust, feature-rich applications, with no computer science degree required and no waiting for the IT department to develop the apps for them.

Requirement Gathering Blog Series, Part 5: Dealing with Objections

This is Part 5 of the Requirement Gathering Blog series by Rahul Parwal, where we explore dealing with objections. We'd like to thank him for sharing his expertise with the community through this information-packed piece. When proper requirement documentation is lacking, it is important for testers to shift their perspective and focus on the needs and wants of the end users or customers.

Building GraphQL APIs with PostgreSQL: Top Developer Tools to Consider

Developers often pair GraphQL with PostgreSQL to define their data structure and build reliable, scalable, high-performing applications. Selecting the right framework, however, is crucial to simplifying and streamlining development when building a GraphQL API with PostgreSQL. This blog will explore the top tools for the job, including Hasura, Postgraphile, Prisma, and GraphQL Nexus.
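The core service these frameworks provide is generating GraphQL resolvers from the database schema automatically. As a minimal sketch of what that automation replaces, here is a hand-written resolver in Python; the `users` table and its fields are hypothetical, and sqlite3 stands in for PostgreSQL so the example is self-contained:

```python
import sqlite3

# Stand-in database (sqlite3 instead of PostgreSQL, for a self-contained
# example); the `users` table and its fields are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'Ada', 'ada@example.com')")

def resolve_user(user_id):
    """A hand-written resolver: fetch one row and shape it like a GraphQL type.

    Tools such as Hasura and Postgraphile generate resolvers like this
    automatically from the database schema.
    """
    row = conn.execute(
        "SELECT id, name, email FROM users WHERE id = ?", (user_id,)
    ).fetchone()
    if row is None:
        return None
    return {"id": row[0], "name": row[1], "email": row[2]}

# Roughly what a `{ user(id: 1) { id name email } }` query would resolve to:
print(resolve_user(1))
```

Multiply this boilerplate by every table, relation, and mutation in a real schema, and the appeal of a framework that derives it all from the database becomes clear.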

Building a Data-Centric Platform for Generative AI and LLMs at Snowflake

Generative AI and large language models (LLMs) are transforming both developer and non-coder productivity by automating repetitive tasks and quickly generating insights from large amounts of data. Snowflake users are already taking advantage of LLMs, integrating web-hosted LLM APIs through external functions and using Streamlit as an interactive front end for LLM-powered apps such as AI plagiarism detection, AI assistants, and MathGPT.

Using Dead Letter Queues with SQL Stream Builder

Cloudera SQL Stream Builder gives non-technical users the power of a unified stream processing engine, so they can integrate, aggregate, query, and analyze both streaming and batch data sources in a single SQL interface. This allows business users to define events of interest that they need to monitor continuously and respond to quickly. A dead letter queue (DLQ) can capture events that fail deserialization when they are consumed from a Kafka topic.
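The DLQ pattern itself is framework-agnostic. This is a minimal Python sketch of the idea (not SQL Stream Builder's actual configuration, which is done inside SSB itself): records that fail deserialization are routed to a dead letter queue instead of crashing the pipeline. The record payloads are hypothetical:

```python
import json

# Simulated raw Kafka records; the third is truncated and will fail
# deserialization. Payloads are hypothetical.
records = [
    b'{"event": "click", "user": 1}',
    b'{"event": "purchase", "user": 2}',
    b'{"event": "click", "user":',  # malformed JSON
]

processed = []
dead_letter_queue = []

for raw in records:
    try:
        processed.append(json.loads(raw))
    except json.JSONDecodeError as exc:
        # Instead of failing the whole pipeline, route the bad record
        # (with its error) to the DLQ for later inspection or replay.
        dead_letter_queue.append({"raw": raw, "error": str(exc)})

print(f"processed={len(processed)}, dead-lettered={len(dead_letter_queue)}")
```

In a real deployment the dead-lettered records would be written to a separate Kafka topic rather than an in-memory list, so that malformed events can be audited and reprocessed without blocking the healthy stream.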