
December 2021

Heroku Private Space and mTLS

Due to the increasing number of ransomware attacks and security breaches, premiums for cyber-insurance have gone up by 10 to 40 percent in recent years. “Minimum security requirements have definitely changed, especially the way insurers review the companies,” says Ara Aslanian, CEO of Inverselogic. “Previously, the checking process was mostly conducted by self-assessment, which meant the insurers would send companies self-assessment sheets for them to check the boxes.”

2021 - The Year of Innovations for "Jobs to be Done"

While COVID-19 continues to cause devastating disruption to the global economy nearly two years into the pandemic, it is also continuing to force remarkable innovation across different industries. Companies have found new ways to sell, service and operate during the crisis. For me, there is one common theme for these innovative companies, including Qlik, and it is “Jobs to be Done.”

Connecting to MySQL With Python

The MySQL database is a popular option for storing data. It's powerful, reliable, and easy to use. However, it can be challenging to work with if you don't have the right tools. Luckily, Python has an API for MySQL that makes working with this database simple. Connecting to MySQL with Python code is a great way to build a rich set of data through programming and to create database content quickly.
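
As a minimal sketch of what that API looks like, the helper below builds a parameterized INSERT statement and the `connect_and_insert` function shows how it would be used with the `mysql-connector-python` package (assumed installed via `pip install mysql-connector-python`). The host, credentials, and table names are placeholders, not real values.

```python
def build_insert(table, columns):
    """Build a parameterized INSERT statement. Placeholders (%s) let the
    MySQL driver escape values safely instead of string-formatting them in."""
    cols = ", ".join(columns)
    marks = ", ".join(["%s"] * len(columns))
    return f"INSERT INTO {table} ({cols}) VALUES ({marks})"


def connect_and_insert():
    """Illustrative connection sketch; fill in real credentials before calling."""
    import mysql.connector  # imported here so build_insert stays dependency-free

    conn = mysql.connector.connect(
        host="localhost", user="app_user", password="secret", database="shop"
    )
    try:
        cur = conn.cursor()
        cur.execute(build_insert("customers", ["name", "email"]),
                    ("Ada Lovelace", "ada@example.com"))
        conn.commit()
    finally:
        conn.close()
```

The same `cursor.execute` pattern covers SELECT, UPDATE, and DELETE as well; only the SQL string and parameters change.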

Looking into 2022: Predictions for a New Year in MLOps

In an era where the passage of time seems to have changed somehow, it feels strange to already be reflecting on another year gone by. It’s a cliché for a reason: the world feels like it’s moving faster than ever, and in some completely unexpected directions. When I consider the pace of technological progress I’ve witnessed in just a year, it sometimes feels like we’re living in a time lapse.

9 Expert Tips for Using Snowflake

Snowflake is a robust data warehouse that has changed the data science game for many organizations. With its cloud-native architecture, Snowflake lets you analyze your data using one of the most sophisticated query engines available today. But using Snowflake is not always as simple as using other products on the market. Below are nine expert tips to help you master the Snowflake platform.

Migrating Data During a Merger or Acquisition

Since I’m now migrating NodeGraph’s processes to Qlik, I thought it may be a good time to talk about migrating data during a merger or acquisition. There are many aspects to consider. Here are some of my thoughts on why companies merge or migrate data landscapes, common M&A migration pitfalls and how to avoid them, the time and cost involved migrating data during a merger or acquisition, and other topics.

How to Start E-Commerce Integration

As the world’s number one CRM provider, Salesforce has been transforming customer relationship management for the last two decades across various e-commerce platforms and e-commerce websites. From increased communication to improved customer relationships to better business plans, the Salesforce platform has helped countless organizations move a step above their competitors.

Operationalize Your Data Warehouse With Reverse ETL

Data warehousing aggregates data from disparate sources so you can run real-time reports for greater business intelligence. However, a data warehouse does more than generate big data analytics. How about using it as a data source rather than just a destination? You can move data from your warehouse to other systems in your networks, such as Salesforce or Zendesk, and improve existing operations.
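
The core of that "warehouse as a source" idea is a mapping step: rows pulled from a warehouse table get reshaped into the payloads an operational tool expects. A minimal sketch, with made-up field names and Salesforce's Contact object used purely for illustration:

```python
def rows_to_contacts(rows):
    """Map warehouse rows (dicts) into Salesforce-style Contact payloads."""
    contacts = []
    for row in rows:
        contacts.append({
            "FirstName": row["first_name"],
            "LastName": row["last_name"],
            "Email": row["email"].lower(),  # normalize before syncing out
        })
    return contacts


warehouse_rows = [
    {"first_name": "Ada", "last_name": "Lovelace", "email": "Ada@Example.com"},
]
payloads = rows_to_contacts(warehouse_rows)
# A real reverse-ETL sync would now send these payloads to Salesforce's
# REST or Bulk API; a tool handles retries, batching, and auth for you.
```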

Why Log Data Retention Windows Fail

If you’re using Elasticsearch as part of an ELK stack solution for log analytics, you’ll need to manage the size of your indexed log data to achieve the best performance. Elasticsearch indices have no limit to their size, but a larger index takes longer to query and is more costly to store. Performance degradation is often observed with large Elastic indices and queries on large indices can even crash Elasticsearch when they use up all of the available heap memory on the node.
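
One common way to keep index size bounded is an index lifecycle management (ILM) policy that rolls indices over and eventually deletes them. A minimal sketch follows; the size and age thresholds are illustrative, not recommendations:

```json
{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": { "max_size": "50gb", "max_age": "7d" }
        }
      },
      "delete": {
        "min_age": "30d",
        "actions": { "delete": {} }
      }
    }
  }
}
```

Applied via `PUT _ilm/policy/<name>`, this rolls each log index over at 50 GB or 7 days and deletes indices 30 days after rollover, which is exactly the retention-window trade-off the rest of this piece examines.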

Why You Need a Salesforce Uploader Today!

With more than 150,000 customers and millions of users, Salesforce processes a lot of data — accounts, contacts, activities, leads, opportunities, you name it. And people are adding more data to Salesforce all the time. As organizations realize the benefits of customer relationship management, Salesforce has become the number one destination for sales, marketing, eCommerce, and field service data. That's even more accounts, contacts, activities, leads, and opportunities.

Three reasons you need modern cloud analytics now

Data is everywhere. As the sheer volume and number of data sources continue to explode, so do new opportunities for modern businesses to create and act on insights. That is, if they are equipped with the right analytics technology. Historically, many businesses have settled for “good enough” analytics tools, putting up with lackluster bundles from full-stack vendors in an attempt to minimize cost or risk.

How to Learn Python Scripting in 7 Simple Steps

Python is one of the most in-demand programming languages in the world — and for good reason. Knowing how to code has never been so valuable thanks to the expanding world of tech and focus on data science. From landing high-paying jobs to improving your skillset, learning Python scripting can bring you many opportunities to succeed. However, while these opportunities are robust, many challenges come with learning Python.

How To Use Change Data Capture with Integrate.io

Change data capture (CDC) is a crucial, but tremendously underappreciated, feature that forms the backbone of modern ETL workloads. Without knowing which data has changed since you last accessed it, you’d be forced to extract all the data from a source table or database every time you perform data integration—a tremendously inefficient process.
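
The simplest form of this idea is timestamp-based CDC: persist a high-water mark from the last sync and extract only rows modified after it. A minimal sketch (production CDC tools more often read the database's transaction log, which also catches deletes):

```python
def extract_changes(rows, last_synced_at):
    """Return rows whose updated_at is newer than the saved watermark,
    plus the new watermark to persist for the next run.
    ISO-8601 timestamp strings compare correctly as plain strings."""
    changed = [r for r in rows if r["updated_at"] > last_synced_at]
    new_watermark = max((r["updated_at"] for r in changed), default=last_synced_at)
    return changed, new_watermark


source_table = [
    {"id": 1, "updated_at": "2021-12-01T09:00:00"},
    {"id": 2, "updated_at": "2021-12-03T14:30:00"},
]
changes, watermark = extract_changes(source_table, "2021-12-02T00:00:00")
# Only row 2 changed since the watermark, so only it is re-extracted.
```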

Cloudera Data Engineering 2021 Year End Review

Since the release of Cloudera Data Engineering (CDE) more than a year ago, our number one goal has been operationalizing Spark pipelines at scale with first-class tooling designed to streamline automation and observability. In working with thousands of customers deploying Spark applications, we saw significant challenges with managing Spark as well as automating, delivering, and optimizing secure data pipelines.

Why Understanding Dark Data Is Essential to the Future of Finance

“Water, water, everywhere, nor any drop to drink.” The famous line from Samuel Taylor Coleridge’s epic poem “The Rime of the Ancient Mariner” has a fitting application to today’s data problem. Enterprises are deluged with data, but they often have no way to leverage it. According to most experts, only a small percentage of data is usable and made useful, and most of it is in the dark — thus the term, “dark data.”

New Year, New UI: Get Started in Snowsight

Out with the old; in with the new! If you haven’t already checked out the new Snowflake® interface (aka Snowsight®), make it your New Year’s resolution. Set yourself up for success in 2022 by spending a few minutes getting to know the new features and experiences that are in public preview—available when you click the Snowsight button at the top of your console’s menu bar.

Big Data Meets the Cloud

With interest in big data and cloud increasing around the same time, it wasn’t long until big data began being deployed in the cloud. Big data comes with some challenges when deployed in traditional, on-premises settings. There’s significant operational complexity, and, worst of all, scaling deployments to meet the continued exponential growth of data is difficult, time-consuming, and costly.

Leveraging BigQuery Audit Log pipelines for Usage Analytics

In the BigQuery Spotlight series, we talked about Monitoring. This post focuses on using Audit Logs for deep dive monitoring. BigQuery Audit Logs are a collection of logs provided by Google Cloud that provide insight into operations related to your use of BigQuery. A wealth of information is available to you in the Audit Logs. Cloud Logging captures events which can show “who” performed “what” activity and “how” the system behaved.

10 Best Practices for Building a Good API

APIs are being created faster than ever before with ever-advancing technologies such as Node.js and AngularJS. With the flexibility in design and integrations for APIs, there isn't a more exciting time than now to be an API developer. However, with so many new technologies and methods of creating APIs comes the question, "What makes a good API?" While the increase in API creation has many advantages for businesses in multiple areas, there is also more room for low-quality API production.

Recognizing Organizations Leading the Way in Data Security & Governance

The right set of tools helps businesses utilize data to drive insights and value. But balancing a strong layer of security and governance with easy access to data for all users is no easy task. Retrofitting existing solutions to ever-changing policy and security demands is one option. Another option — a more rewarding one — is to include centralized data management, security, and governance into data projects from the start.

Modern Data Stack using Integrate.io for the ELT

Integrate.io is a company that provides an ELT (Extract, Load and Transform) data stack. Transformations can be handled with dbt (the data build tool), and Integrate.io can then push the transformed data into systems like Salesforce. This setup gives you better control over your data and provides a cost-effective solution.

Is SSIS a Good ETL Tool?

ETL (Extract, Transform and Load) is a well-known data integration process. There is an overwhelming number of tools that you can use (one of which is SSIS) and it can be difficult to choose between them. What exactly is SSIS, and how can it help your company perform ETL better than you ever have before? This article will explain the major features of SSIS, demonstrate the pros and cons of implementing it, and advise as to when you might be better off with a different ETL tool.

Data Goes Around The World In 80 Seconds With Snowflake

See how a database named Phileas Fogg can journey around the world in 80 seconds on Snowflake in this animated short. With Snowflake, PHILEAS_FOGG can failover in the event of disruption to enable continuous business operations and be joined with local data sets for global data collaboration across clouds.

Will cloud ecosystems finally make insight to action a reality?

For decades, the technologies and systems that deliver analytics have undergone massive change. What hasn’t changed, however, is the goal: using data-driven insights to drive actions. Insight to action has been a consistent vision for the industry. Everyone from data practitioners to technology developers has sought this elusive goal, but as Chief Data Strategy Officer Cindi Howson points out, it has remained unfulfilled — until now.

Announcing Our $4M Seed and Continual Public Beta

Today we’re excited to announce the public beta launch of Continual, the first operational AI platform built specifically for modern data teams and the modern data stack. We’re also announcing our $4M Series Seed, led by Amplify Partners, and joined by Illuminate Ventures, Wayfinder, DCF, and Essence, as well as new partnerships with Snowflake and dbt Labs.

How to migrate an on-premises data warehouse to BigQuery on Google Cloud

Data teams across companies face the continuous challenge of consolidating data, processing it, and making it useful. They deal with challenges such as a mixture of multiple ETL jobs, long ETL windows, capacity-bound on-premises data warehouses, and ever-increasing demands from users. They also need to make sure that the downstream requirements of ML, reporting, and analytics are met by the data processing.

Unlocking Data Literacy Part 2: Building a Training Program

As we head into the holidays, there’s no better time to talk about bringing people together. And there’s no better way to bring employees together within a company aspiring to be data-driven than with a data literacy program. What data analytics processes should your organization put into place to increase data literacy? It all starts with establishing a training program to empower your people to work with data, regardless of their level of expertise.

What is Amazon Redshift Spectrum?

Amazon S3 (Simple Storage Service) has been around since 2006. Most use this scalable, cloud-based service for archiving and backing up data. Within 10 years of its birth, S3 stored over 2 trillion objects, each up to 5 terabytes in size. Enterprises value their data as something worth preserving. But much of this data lies inert, in “cold” data lakes, unavailable for analysis. Also called “dark data”, it can hold key insights for enterprises.

Redshift Join: How to use Redshift's Join Clause

Redshift’s JOIN clause is perhaps the second most important clause after the SELECT clause, and it is used even more ubiquitously, considering how interconnected a typical application database’s tables are. Because of that connectivity between datasets, data developers need many joins to collect and process all the data points involved in most use cases. Unfortunately, as the number of tables you’re joining grows, so does the sluggishness of your query.
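
The JOIN syntax involved is standard SQL that Redshift accepts; the sketch below uses Python's built-in sqlite3 module only so it can run without a cluster, and the table and column names are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a Redshift schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER, name TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO orders VALUES (10, 1, 99.5), (11, 1, 20.0);
""")

# LEFT JOIN keeps customers with no orders; COUNT(o.id) ignores the
# NULLs those unmatched rows produce, so Grace shows a count of zero.
rows = conn.execute("""
    SELECT c.name, COUNT(o.id) AS order_count
    FROM customers AS c
    LEFT JOIN orders AS o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""").fetchall()
```

Each additional table joined in multiplies the work the planner must do, which is why the query slowdown described above grows with the join count.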

What Are The Best ETL Tools For Vertica?

Vertica claims to offer the "most advanced unified analytical warehouse" in the world, providing actionable data insights you can't find anywhere else. The truth is, like any data warehouse, Vertica is only as good as the data you put into it. Moving data to Vertica can be a headache for organizations without a data engineering team. Data might live in various locations — transactional databases, relational databases, customer relationship management (CRM) systems, you name it.

PostgreSQL to Amazon Redshift: 4 Ways to Replicate Your Data

PostgreSQL is the preferred platform of millions of developers around the world. The open-source tool is one of the most powerful databases on the planet, with the ability to handle sophisticated analytical workloads and high levels of concurrency. That makes PostgreSQL (also called Postgres) a popular DB for scientific research and AI/ML projects. It’s also a popular production database for data-driven companies in every industry. But no database is perfect.

Adopting a Production-First Approach to Enterprise AI

After a year packed with one machine learning and data science event after another, it’s clear that there are a few different definitions of the term ‘MLOps’ floating around. One convention uses MLOps to mean the cycle of training an AI model: preparing the data, training the model, and evaluating it. This iterative, interactive process often includes AutoML capabilities, and what happens outside the scope of the trained model is not included in this definition.

SaaS in 60 - Catalog KPI and Qlik Lineage Connectors

Catalog KPIs: These KPIs help you understand key metrics of apps, data, notes, automations, and monitored charts viewable in the catalog. The indicators represent usage and views of each item, such as how many apps are using a particular data set and which items are being used most, including a trend indicator showing more, fewer, or no change in views over a 28-day period.

Why a Data Lakehouse alone is not the answer to modern analytics

Can the Lakehouse meet all your analytics needs, or do you need a Data Lake and a Data Warehouse working in parallel? Join us on this live stream to learn when one works better than the other, or whether you really need the combination to win. Our speakers David, Justin, and Chris will debate the different use cases and architectures to determine what is necessary for a data-driven business.

The Ultimate Guide to Redshift ETL: Best Practices, Advanced Tips, and Resources for Mastering Redshift ETL

Amazon Redshift makes it easier to uncover transformative insights from big data. Analytical queries that once took hours can now run in seconds. Redshift allows businesses to make data-driven decisions faster, which in turn unlocks greater growth and success. For a CTO, full-stack engineer, or systems architect, the question isn’t so much what is possible with Amazon Redshift, but how. How do you ensure optimal, consistent runtimes on analytical queries and reports?

AWS Redshift Pricing: How much does Redshift cost?

While Redshift is arguably the best data warehouse on the market, it can come with a hefty price tag. We’ve created this Redshift pricing guide to help you evaluate Redshift cheaply, create a budget for full implementation, and optimize your Redshift setup so that you’re getting the most bang for your data buck. Ready to get started? Think of this blog post as a “choose your own adventure” guide.

Microsoft Azure vs Amazon Redshift

When choosing any SaaS application, you must start with a clear understanding of your business requirements. Then ask yourself the following questions: Develop a framework for data processing requirements, and you'll find a data warehouse solution that provides the right amount of power, functionality, and high performance for data analytics. Keep the answers to these questions in mind when reading through this article.

Stitch builds on its Microsoft technology partnership

Stitch is pleased to announce the availability of Microsoft SQL Server as a destination. MS SQL Server joins nine other data destinations (including Microsoft Azure Synapse) that Stitch supports to help execute all your data modeling and analysis projects. Stitch customers can immediately benefit from the new destination, which supports both Azure SQL Server and standard SQL Server editions reaching as far back as SQL Server 2012.

Design With Analytics in Mind for Data Governance

The following is Part III of a three-part series. Welcome to the final installment of a three-part series discussing the areas to take seriously when you want to drive business with analytics. In Part I of this series, I discussed how to prioritize data accessibility and how to address the challenges that come with it. Those challenges include: Part II discussed where the disconnect is and addressed how organizations can bridge the gap.

Data Hub, Fabric or Mesh? Part 1 of 2

Over the course of my next two blog posts, I would like to share my thoughts around a debate raging in data architecture circles. The bone of contention? That the 21st century needs a new data management paradigm for modern analytics. First up, I’ll frame the argument and explain the two prominent approaches of data hub and data fabric. Then, I’ll cover data mesh and compare all three architectures. As always, I’d love to get your input, feedback, queries and comments!

Scaling NLP Pipelines at IHS Markit - MLOps Live #17

The data science team at IHS Markit will be sharing practical advice on building sophisticated NLP pipelines that work at scale. Using a robust and automated MLOps process, they run complex models that make massive amounts of unstructured data searchable and indexable. In this session, they will share their journey with MLOps and provide practical advice for other data science teams looking to do the same.

CDP on Azure: Harnessing the Power of Data Flow and Event Processing

Data is being created at an ever-increasing rate, and generating insights from event streams has become a critical function for businesses. How can we process this data flowing into the enterprise, then evaluate, enrich, and transform it, all in real time, to enable fast analytics that support intelligent decision making? Join us for this session, where we will look at how we can use the elastic nature of Azure to scale Data Flows and perform SQL operations in real time on streaming data from a variety of sources.

Introducing Continual Integration for dbt

Today we’re pleased to announce Continual Integration for dbt. We believe this is a radical simplification of the machine learning (ML) process for users of dbt and presents a well-defined path that bridges the gap between data analytics and data science. Read on to learn more about this integration and how you can get started.

AI and ML: No Longer the Stuff of Science Fiction

Artificial Intelligence (AI) has revolutionized how various industries operate in recent years. But with growing demands, there’s a more nuanced need for enterprise-scale machine learning solutions and better data management systems. The 2021 Data Impact Awards aim to honor organizations who have shown exemplary work in this area.

Fighting Financial Crime and Earning Trust Using Data-Driven Compliance

One of the most challenging and complex elements of operating a financial services institution is compliance. Managing risk, security and privacy to earn customers’ trust has long been at the core of financial services, but this foundation has been shaken over recent years.

4 Tips for Recognizing and Avoiding Analytics Bias

One of the key cornerstones of the emerging field of ethical, explainable AI is recognizing and avoiding bias. As AI takes on a greater role in organizations with sometimes opaque calculations, there is an increased urgency in many businesses to get ahead of these challenges, and companies, such as IBM, Salesforce and Microsoft, have already added roles specifically with the aim of ensuring that ethics are a key consideration of AI.

A Finance Leader's Guide to Data Modernization

In today’s tech-forward companies, CFOs are tasked with managing and overseeing an increasingly expansive domain of systems and technologies to thrive. The rise of regulatory considerations, novel market drivers and a globally connected business environment is creating an entirely new set of pressures on both the structure of the department and on leadership.

Qlik Reporting Service - Brief Overview and Quick Demo - Part 1

This video provides a brief introduction to the Qlik Reporting Service and a quick demo. More detailed demonstrations, best practices, and tips for using it with Qlik Application Automation and Qlik Sense are in Part 2 of this video. This initial release of the Qlik Reporting Service provides multi-channel, multi-page report output distribution and delivery of Qlik Sense insights to your organization, either using a public API or the Reporting Service connectors available in Qlik Application Automation.

Qlik Reporting Service - Build-out Demonstration and Bursting Example - Part 2

This video provides more detailed build-out demonstrations, best practices, and tips for using the Qlik Reporting Service with Qlik Application Automation and Qlik Sense. This initial release of the Qlik Reporting Service provides multi-channel, multi-page report output distribution and delivery of Qlik Sense insights to your organization, either using a public API or the Reporting Service connectors available in Qlik Application Automation.

Qlik Reporting Service - Brief Overview with Detailed Demonstrations - Part 1 and Part 2

Chapter index below! This video provides a brief introduction to the Qlik Reporting Service and more detailed demonstrations, best practices, and tips for using it with Qlik Application Automation and Qlik Sense. It contains both Part 1 (short overview and brief demo) and Part 2 (more detailed build-out and usage). This initial release of the Qlik Reporting Service provides multi-channel, multi-page report output distribution and delivery of Qlik Sense insights to your organization, either using a public API or the Reporting Service connectors available in Qlik Application Automation.

Australia's Department of Skills & Education and MinterEllison Discuss Our Digital Future

As businesses and governments worldwide struggle with the challenges of the current pandemic, making rapid use of real-time data has proven to be a very powerful tool. This video takes a look at how data is being used in organizations across Australia including the Department of Skills and Education, and MinterEllison.

Automating MLOps for Deep Learning

MLOps holds the key to accelerating the development, deployment, and management of AI, so that enterprises can derive real business value from their AI initiatives. Deploying and managing deep learning models in production carries its own set of complexities. In this talk, we will discuss real-life examples from customers that have built MLOps pipelines for deep learning use cases, such as predicting rainfall from CCTV footage to prevent flooding.

Getting Started with CI/CD and Continual

While CI/CD is synonymous with modern software development best practices, today’s machine learning (ML) practitioners still lack similar tools and workflows for operating the ML development lifecycle on par with software engineers. For background, we trace a brief history of transformational CI/CD concepts and show how they’re missing from today’s ML development lifecycle.

Stitch vs. Fivetran vs. Xplenty: A Comprehensive Comparison

When it comes to providing the latest and greatest ETL and ELT tools, the platforms Stitch, Fivetran, and Xplenty are all top contenders. That being said, each platform also has its own set of pros and cons. Ultimately, the best ETL/ELT platform for your company will largely depend upon the needs of your organization. So, which platform will reign supreme for your company in the Stitch vs Fivetran vs Xplenty matchup?

New Snowflake Features Released In October And November 2021

Coming off of our Snowday event, we’ve unveiled a number of new product capabilities that expand what is possible in the Data Cloud. From helping businesses operate globally with improved replication efficiency, empowering developers with new functionality in Snowpark, and improving the security and governance of data through native object tagging, there is no shortage of exciting advancements coming to Snowflake.

Anodot and Rivery Demo New Marketing Analytics Kit

Marketing teams routinely struggle with monitoring the performance and cost of their ad campaigns. Now, they have a solution that can be as easy as just a few clicks. We recently joined our partners at Rivery for a webinar demonstrating the new Anodot Marketing Analytics Monitoring Kit. The kit allows users to track marketing campaigns in real time and take the action needed to make the most of ad spend.

What Is Snowflake?

As a company’s data assets grow, the need for cloud computing increases in tandem. For keeping pace with this growth, Snowflake stands above the rest. What makes Snowflake so special? This cloud-agnostic platform takes the best of traditional database technology and combines it with modern cloud computing to drive the agility and innovation companies need to remain competitive. It features on-the-fly scaling, flexible clustering options, and the capability to hold several petabytes of information.

It's An Exciting Future in Music When We Have Clean Data

Hello, I’m Imogen Heap – musician and tech founder. Now, I know what you might be thinking. Why is a musician and composer talking to me about data? Well, for any of you that follow my work, you will know that I am a bit of a data nerd, and that’s why I loved speaking to Joe DosSantos on the latest Data Brilliant podcast. A lot of my music is powered by data. In fact, Joe and I discussed my love for how technology and data inspires me to be more creative. Want to know how?

The modern data stack is broken. It's time for Data stack as a service (DStaaS).

Yes, I’ve said it. The modern data stack is a pain to work with. But it wasn’t always like that. As companies realized they could leverage data to accelerate growth, new data tools were invented. From NoSQL databases that specialize in processing specific data structures (graph anyone?) to the Python-Pandas-like Spark ecosystem that allows you to run queries on Big Data (capital B, mind you). But with every new tool added to the data stack, the complexity increased.

Business Analytics: The Future Is AI and It Is Here

Business analytics (BA) is the process of evaluating data in order to gauge business performance and to extract insights that may facilitate strategic planning. It aims to identify the factors that directly impact business performance, such as revenue, user engagement, and technical availability. BA takes data from all business levels, from product and marketing to operations and finance.

Best ETL Tools for Heroku

Heroku leverages the open-source technology of PostgreSQL to deliver a powerful and reliable database-as-a-service solution. Heroku Postgres acts as both a source and destination for data integration. With native support for a variety of programming languages and features that ensure security and compliance, it is a top resource for many companies. What do you need to know about Heroku, and what tools should you use to perform ETL (extract, transform and load) and ELT (extract, load and transform)?

Driving Industry Transformation Through the Use of Data

As organizations look to improve business operations and outcomes, global industries are pushing for data-driven transformation. The 2021 Cloudera Data Impact Awards recognize those organizations that have pulled ahead of the pack with efforts to leverage the power of data to improve operations and better serve their customers. The finalists in the “Industry Transformation” category are MTN, National Payments Corporation of India (NPCI), Sberbank, and Bank Negara Indonesia (BNI).

Delivering High Performance for Cloudera Data Platform Operational Database (HBase) When Using S3

CDP Operational Database (COD) is a real-time auto-scaling operational database powered by Apache HBase and Apache Phoenix. It is one of the main Data Services that runs on Cloudera Data Platform (CDP) Public Cloud. You can access COD right from your CDP console. With COD, application developers can now leverage the power of HBase and Phoenix without the overheads related to deployment and management.

The Developer Playground now supports REST API

One of the best features in ThoughtSpot Everywhere is the Developer Playground. The Playground lets frontend developers visually configure elements and generate JavaScript code to add to your web app. It is an amazing tool for testing and iterating on configuration options before adding final elements such as Search, Liveboards, and visualizations into your web app. But what about a backend developer who might be building solutions that use the platform’s APIs?

IT Professionals Reveal Cloud Data Platform Highs and Lows of 2021

Wondering whether your struggles with the data lake, cloud data platform, or analytics at large are typical? Are you ahead of or behind the curve? ChaosSearch recently commissioned a survey to understand the advantages and setbacks organizations face today in these areas, and we’re excited to share a sneak peek of the results. To uncover more detailed findings from our research, sign up to receive the full report once it’s available.

Customer 360: Explained

Imagine synchronizing every piece of customer data from across your organization into a single integrated platform. You could generate a panoramic view of consumer activities and develop an understanding of what your customers really want. Customer 360 helps you do all of that. It's the practice of integrating customer data from multiple sources so you can deliver better experiences for every person who engages with your organization. Below, learn about customer 360, how it works, and its benefits.

Simplifying Use of External APIs with Request/Response Translators

Snowpark has generated significant excitement and interest since it was announced. Snowpark is a developer framework that enables data engineers, data scientists, and data developers to code in their language of choice, and execute pipelines, machine learning (ML) workflows, and data applications faster and more securely. While many parts of Snowpark are in preview stages, External Functions entered General Availability earlier this year.

Creating the Ultimate Analytics Stack with Moesif and Datadog

When looking at API analytics and monitoring platforms, many seem to be so similar that it’s hard to figure out the differences between them. We often hear this confusion from users and prospects. In a world with so many tools available, how do we figure out which ones are necessary and which are redundant? One of the most common questions we are asked revolves around how Moesif compares to Datadog and how they could work together.

How Hybrid and Cloud-Based Architectures are Unlocking the Power of Data

It takes vision, purpose, and skill to unlock the power of data. It also takes the right strategy. For ExxonMobil, Ares Trading (Merck), and the University of California San Diego (UCSD), the right strategy is taking full advantage of the cloud. All three organizations have partnered with Cloudera, leveraging a hybrid or cloud-based architecture to improve the lives of the people who depend on their organizations’ data.

A Glimpse Into How AI Is Modernizing Data for the Financial Services Industry

Organizations in the financial services sector face a unique set of challenges as they consider how to wrangle and process the vast amount of data they collect. During our Financial Services Summit, I was lucky enough to speak to Brian Anthony, chief data officer for the Municipal Securities Rulemaking Board (MSRB), to learn how the MSRB is integrating technologies such as artificial intelligence (AI) and machine learning to modernize its data.

Analysts Can Now Use SQL to Build and Deploy ML Models with Snowflake and Amazon SageMaker Autopilot

Machine learning (ML) models have become key drivers in helping organizations reveal patterns and make predictions that drive value across the business. While extremely valuable, building and deploying these models remains in the hands of only a small subset of expert data scientists and engineers with deep programming and ML framework expertise.
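Under the hood, an integration like this ultimately launches a SageMaker Autopilot job against training data in S3. The helper below assembles the kind of configuration that boto3's SageMaker client accepts for `create_auto_ml_job`, as a rough sketch; the bucket paths, job name, and role ARN are placeholders:

```python
# Hedged sketch: assembling the configuration for a SageMaker Autopilot
# job of the kind such a Snowflake integration would drive. All names,
# S3 paths, and the role ARN are placeholders for illustration.

def autopilot_job_config(job_name: str, train_s3_uri: str,
                         target_column: str, output_s3_uri: str,
                         role_arn: str) -> dict:
    """Build keyword arguments for sagemaker.create_auto_ml_job()."""
    return {
        "AutoMLJobName": job_name,
        "InputDataConfig": [{
            "DataSource": {"S3DataSource": {
                "S3DataType": "S3Prefix",
                "S3Uri": train_s3_uri,       # training CSVs exported to S3
            }},
            "TargetAttributeName": target_column,  # column to predict
        }],
        "OutputDataConfig": {"S3OutputPath": output_s3_uri},
        "RoleArn": role_arn,
    }

cfg = autopilot_job_config(
    "churn-demo", "s3://my-bucket/train/", "churned",
    "s3://my-bucket/output/", "arn:aws:iam::123456789012:role/SageMakerRole",
)
# A real call would then be:
#   boto3.client("sagemaker").create_auto_ml_job(**cfg)
print(cfg["InputDataConfig"][0]["TargetAttributeName"])  # churned
```

The point of the Snowflake integration is that an analyst never writes this: the SQL interface generates and submits the equivalent job definition for them.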

How Snowflake Support Is Continuously Improving the Customer Experience

At Snowflake, putting the customer first is an essential company value. But “customer-centric” is more than just a buzzword: We use a data-driven, outside-in lens on everything we do, at all levels of the company. In particular, here’s how Snowflake Support is listening to you—our customers—and continuously improving the Snowflake customer experience at every touchpoint.

Qlik and UiPath - The Power of Active Intelligence and Enterprise Workflows for Action

The business world is pivoting constantly. Strategic shifts, reprioritization, and being first all require being smart while moving fast. The value of agility has never stood out more, given the need to react to new realities in everything from public health, remote and in-office business policies, and workflows to broader economic concerns like the supply chain as we move into recovery and revitalization.

How Did Your Paid Marketing Channels Perform on Black Friday?

It's a week after Black Friday, and the results are in! While online spending didn't break a new record this year, it still totaled a massive $8.9 billion, making Black Friday one of the biggest sales days ever for digital merchants. Online sales were even healthier on Cyber Monday, totaling $10.7 billion. But how did paid marketing contribute to all those holiday shopping season sales? Some e-commerce retailers struggle to measure the effectiveness of their paid campaigns.

Best ETL tools for Snowflake

ETL (extract, transform, load) is the backbone of modern data integration, efficiently migrating massive quantities of information into a data warehouse like Snowflake. But with so many Snowflake ETL tools on the market these days, how can you choose the best for migrating your data? Below we’ll discuss our favorite Snowflake ETL tools, including their pros, cons, and user reviews so that you can make the choice that’s right for your situation.

In AI we trust? Why we Need to Talk About Ethics and Governance (part 2 of 2)

In part 1 of this blog post, we discussed the need to be mindful of data bias and the resulting consequences when certain parameters are skewed. Surely there are ways to comb through the data and keep the risks from spiralling out of control. We need to get to the root of the problem. In 2019, the Gradient Institute published a white paper outlining the practical challenges of ethical AI.

Keboola vs Azure Data Factory: The 8 critical differences

ETL pipelines help companies extract, transform, and load data so it is ready to provide insights and value to the company. But running a smooth data operation depends on building reliable and scalable data ingestion pipelines. SaaS offerings like Keboola and Azure Data Factory take away the heavy lifting.

5 Best Practices for Building a Successful Startup

There's never been a better time to be an entrepreneur looking for investment funding. Global venture capital activity grew mightily in the first half of 2021, and the trend appears to be continuing as we head into 2022. However, that doesn't mean building a new company is any easier. The same inherent resource and growth challenges exist, and venture capitalists still want to see value creation and strong indicators for future success before they invest.

Taking a Closer Look at Snowflake Software

As the amount of big data generated every year grows exponentially, successful enterprises have ditched on-premises solutions for the cloud. Nowhere is this more apparent than in the world of big data analytics, where virtual environments offer more scope and scalability than in-house architecture. Data-driven companies crave cloud databases because of the copious amount of data they collect, process, share, and analyze every single day.

Create your Private Data Warehousing Environment Using Azure Kubernetes Service

For Cloudera, ensuring data security is critical because we have large customers in highly regulated industries like financial services and healthcare, where security is paramount. Other industries, such as retail, telecom, or the public sector, also deal with large amounts of customer data and operate multi-tenant environments, sometimes with end users outside their company, and securing all that data can be a very time-intensive process.

Future of Data Meetup: Future of data and analytics in the Hybrid & Multi Cloud

The most valuable and transformative business use cases require multiple analytics workloads, data science tools, and machine learning algorithms to run against the same diverse data sets. It's how the most innovative enterprises unlock value from their data. Turning data into useful insights is not easy, to say the least. The workloads need to be optimised for hybrid and multi-cloud environments, delivering the same data management capabilities across bare metal, private, and public clouds. In this session, we will discuss how businesses can leverage the combination of best-in-class software and public cloud to turn raw data into actionable insights, without the overheads and without compromising performance, security, and governance.

Easier administration and management of BigQuery with Resource Charts and Slot Estimator

As customers grow their analytical workloads and footprint on BigQuery, their monitoring and management requirements evolve: they want to manage their environments at scale and take action in context. They also want capacity-management capabilities to optimize their BigQuery environments. With our BigQuery Administrator Hub capabilities, customers can now better manage BigQuery at scale.
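The arithmetic behind slot estimation is straightforward: BigQuery reports how many slot-milliseconds a job consumed, so average slot usage is total slot-milliseconds divided by the elapsed milliseconds of the window. A minimal sketch, with made-up job numbers for illustration:

```python
# Sketch of the core slot-estimation arithmetic. BigQuery exposes
# total_slot_ms per job, so average slot usage over a window is
# slot-milliseconds divided by elapsed milliseconds. The job figures
# below are invented for the example.

def avg_slots(total_slot_ms: float, elapsed_ms: float) -> float:
    """Average number of slots a job (or window of jobs) consumed."""
    return total_slot_ms / elapsed_ms

# A job that burned 1,200,000 slot-ms over a 60,000 ms (1-minute) run
# averaged 20 slots.
print(avg_slots(1_200_000, 60_000))  # 20.0
```

Tools like the Slot Estimator aggregate this kind of figure across many jobs to suggest how much capacity a project actually needs.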

API Automation: What You Need to Know

IT automation is an essential business best practice that has enabled countless organizations to become more efficient. According to a survey by Salesforce, 95 percent of executives and directors see the value of automation, and 88 percent want to pursue automation as a key investment for their business. This general tenet holds for the specific case of API automation as well. APIs are a highly underrated, yet critical, technology that underpins the modern digital ecosystem.

Take control of your data with Stitch Unlimited

Without data, your business cannot survive. You need it to understand your customers, to inform your product development, and to plan the future of your company. So why would you tolerate technology that artificially throttles access to your own data and stunts your company’s growth? It just doesn't make sense. That’s why Talend is bringing you Stitch Unlimited and Stitch Unlimited Plus.