
Keboola + ThoughtSpot = Automated insights in minutes

Keboola and ThoughtSpot have partnered to offer click-and-launch insights machines. With the original integration, you could already cut time-to-insight: Keboola helps you get clean data, and ThoughtSpot helps you turn it into insights. What’s new? The new solution ships out-of-the-box, ready-to-use data pipelines (Keboola Templates) and live self-serve analytics dashboards (ThoughtSpot SpotApps). You just click and launch your analytics use case.

Using Moesif's Live Event Log to Filter and Inspect API Calls and Events

Event logs are a common feature of operating systems and other software, keeping track of system and application errors. When you have API traffic to follow or front-end actions you want to watch, Moesif’s Live Event Log is a simple way to filter and find the data you need.
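The filtering workflow can be sketched in plain Python. This is not Moesif's actual API — the event shape and the `filter_events` helper are illustrative assumptions about what typical API call records look like:

```python
# Hypothetical API event records, shaped like typical call logs
# (fields are assumptions, not Moesif's schema)
events = [
    {"verb": "GET", "route": "/users", "status": 200, "duration_ms": 45},
    {"verb": "POST", "route": "/orders", "status": 500, "duration_ms": 320},
    {"verb": "GET", "route": "/orders", "status": 404, "duration_ms": 12},
]

def filter_events(events, verb=None, min_status=None):
    """Return events matching the given HTTP verb and/or minimum status code."""
    result = []
    for e in events:
        if verb is not None and e["verb"] != verb:
            continue
        if min_status is not None and e["status"] < min_status:
            continue
        result.append(e)
    return result

# Pull out only the error responses, as you might via a live-log filter
errors = filter_events(events, min_status=400)
```

In a live event log, the same filters (verb, route, status range) are applied as a query against the incoming stream rather than an in-memory list.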

Power Your Lead Scoring with ML for Near Real-Time Predictions

Every organization wants to identify the right sales leads at the right time to optimize conversions. Lead scoring is a popular method for ranking prospects through an assessment of perceived value and sales-readiness. Scores are used to determine the order in which high-value leads are contacted, thus ensuring the best use of a salesperson’s time. Of course, lead scoring is only as good as the information supplied.
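The scoring idea above can be sketched as a tiny logistic model. The features, weights, and bias here are illustrative assumptions — in a real pipeline they would be learned from historical conversion data:

```python
import math

# Illustrative feature weights (assumptions, not tuned on real data);
# in practice these would be fit by, e.g., logistic regression.
WEIGHTS = {"visited_pricing": 1.5, "emails_opened": 0.3, "demo_requested": 2.0}
BIAS = -2.0

def score_lead(features):
    """Map raw engagement features to a 0-100 conversion score via a logistic function."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0) for k in WEIGHTS)
    return round(100 / (1 + math.exp(-z)))

# A highly engaged lead vs. a barely engaged one
hot = score_lead({"visited_pricing": 1, "emails_opened": 4, "demo_requested": 1})
cold = score_lead({"emails_opened": 1})
```

Because scoring is just a weighted sum and a sigmoid, it is cheap enough to run on every new event, which is what makes near real-time predictions practical.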

How To Use a Customer Data Platform (CDP) as Your Data Warehouse

Whether you’re a mom-and-pop store or an ecommerce giant, understanding the customer journey is crucial to your organization’s success. When you collect data across a wide range of customer touchpoints, you can put this wealth of information to many uses: performing audience segmentation, improving your marketing campaigns, boosting customer engagement, and more.

Complete ETL Process Overview (design, challenges, and automation)

The Extract, Transform, and Load (ETL) process is a set of procedures in the data pipeline. It collects raw data from its sources (extract), cleans and aggregates the data (transform), and saves the data to a database or data warehouse (load), where it is ready to be analyzed. A well-engineered ETL process provides true business value and benefits such as novel business insights: the entire ETL process brings structure to your company’s information.
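The three steps can be sketched end to end with the standard library. The source records and cleaning rules are invented for illustration; a real pipeline would extract from an API, files, or a production database:

```python
import sqlite3

def extract():
    # Pretend source: raw order records, e.g. pulled from an API or CSV export
    return [
        {"order_id": "1", "amount": "19.99", "country": " us "},
        {"order_id": "2", "amount": "5.00", "country": "DE"},
        {"order_id": "1", "amount": "19.99", "country": " us "},  # duplicate row
    ]

def transform(rows):
    # Clean: drop duplicates, cast amounts to float, normalize country codes
    seen, clean = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        clean.append({
            "order_id": r["order_id"],
            "amount": float(r["amount"]),
            "country": r["country"].strip().upper(),
        })
    return clean

def load(rows, conn):
    # Save into a warehouse table (an in-memory SQLite stand-in here)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id TEXT PRIMARY KEY, amount REAL, country TEXT)"
    )
    conn.executemany(
        "INSERT INTO orders VALUES (:order_id, :amount, :country)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
```

Keeping each stage a separate function is what makes the process automatable: a scheduler can rerun the whole chain, and each stage can be tested in isolation.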

Star Schema vs Snowflake Schema and the 7 Critical Differences

Star schemas and snowflake schemas are the two predominant types of data warehouse schemas. A data warehouse schema refers to the shape your data takes: how you structure your tables and their mutual relationships within a database or data warehouse. Since the primary purpose of a data warehouse (and other Online Analytical Processing (OLAP) databases) is to provide a centralized view of all the enterprise data for analytics, data warehouse schemas help us achieve superior analytic results.
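The structural difference can be sketched with in-memory tables. The product/category/department hierarchy is an invented example: a star schema keeps one denormalized dimension row, while a snowflake schema normalizes it into related tables that must be joined back together:

```python
# Star schema: one denormalized dimension table
star_dim_product = [
    {"product_id": 1, "product_name": "Mug",
     "category_name": "Kitchen", "department_name": "Home"},
]

# Snowflake schema: the same dimension normalized into related tables
snowflake_product = [{"product_id": 1, "product_name": "Mug", "category_id": 10}]
snowflake_category = [{"category_id": 10, "category_name": "Kitchen", "department_id": 100}]
snowflake_department = [{"department_id": 100, "department_name": "Home"}]

def resolve_product(product_id):
    """Reassemble the denormalized row by joining the snowflake tables."""
    p = next(r for r in snowflake_product if r["product_id"] == product_id)
    c = next(r for r in snowflake_category if r["category_id"] == p["category_id"])
    d = next(r for r in snowflake_department if r["department_id"] == c["department_id"])
    return {"product_id": p["product_id"], "product_name": p["product_name"],
            "category_name": c["category_name"], "department_name": d["department_name"]}
```

The trade-off is visible in the code: the star row is read directly, while the snowflake version needs two extra joins in exchange for less redundancy.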

FinServ APIs: How to Improve Governance & Deploy with Confidence

Financial services innovation continues to progress at a breakneck pace. For example, fintech developers can programmatically spin up accounts, move money, and issue and manage cards with Increase, or embed financial services into their marketplace with Stripe: capabilities that were unimaginable just a few years ago.

Data Governance and Strategy for the Global Enterprise

While the word “data” has been common since the 1940s, managing data’s growth, current use, and regulation is a relatively new frontier. Governments and enterprises are working hard today to figure out the structures and regulations needed around data collection and use. According to Gartner, by 2023, 65% of the world’s population will have their personal data covered under modern privacy regulations.