
November 2022

Data modeling techniques for data warehousing

When setting up a modern data stack, data warehouse modeling is often the very first step. It is important to create an architecture that supports the data models that you wish to build. I often see people going straight to writing complex transformations before thinking about how they want to organize the databases, schemas, and tables within their warehouse. To succeed, it is key to design your data warehouse with your models in mind before starting the modeling process.

Transaction Support in Cloudera Operational Database (COD)

CDP Operational Database enables developers to quickly build future-proof applications that are architected to handle data evolution. It helps developers automate and simplify database management with capabilities like auto-scale, and is fully integrated with Cloudera Data Platform (CDP). For more information and to get started with COD, refer to Getting Started with Cloudera Data Platform Operational Database (COD).

Česká spořitelna: How the Biggest Czech Bank Builds Data Products in Days Instead of Weeks

Česká spořitelna is the biggest Czech retail bank, with 4.5 million clients across 400 branches. Running a bank of this size brings its own data challenges, from strict regulatory compliance through a wide range of data management needs, to almost limitless product possibilities within the data-rich environment.

How You Can Contribute to ClearML's MLOps Platform

ClearML is an open source MLOps platform, and we love the community that’s been growing around us over the last few years. In this post, we’ll give you an overview of the structure of the ClearML codebase so you know what to do when you want to contribute to our community. First things first. Let’s take a look at our GitHub page and corresponding repositories. Later on, we’ll cover the most important ones in detail.

Data Governance Framework Policy - What Do You Need to Know?

According to IDC's Global DataSphere, 64.2 ZB of data was created in 2020 alone. This number is projected to grow by 23% annually from 2020 to 2025. Therefore, we need data governance frameworks for efficient data management and control, which will help us extract maximum value out of such high volumes of data. Such frameworks are required for data integrity, data protection, and data security. Indeed, according to BDO, the average cost of a data breach has been estimated at around USD 3.8 million.

What is Data Governance? Accountability and Quality Control in Analytics

Effective control and governance over your data assets is vital for long-term business success. By keeping your data available, reliable, and usable for analysis, consumption, and sharing, you can ensure data quality, data security, and data reliability are consistently met. However, many organizations today struggle to implement governance frameworks over their data, which has once again highlighted the importance of data governance. So, what is data governance?

How to Benefit from A/B Testing on Mobile

A/B testing is the most effective way to observe users’ behavior with two or more different versions of the same screen or in-app experience. It can help you test variations of an item, generally UI based, to determine which one performs better. Directing some users to version A and others to version B allows you to observe user behavior for each.
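
The split described above is usually done deterministically. As a rough sketch (the function name, experiment label, and variant labels are all invented for the example, not tied to any particular testing SDK), hashing the user id together with the experiment name gives each user a stable assignment:

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically bucket a user into one of the variants."""
    # Hash the user id together with the experiment name so assignments
    # are stable across sessions but independent between experiments.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(assign_variant("user-42", "checkout-screen"))
```

Because the assignment is derived from a hash rather than stored state, the same user sees the same version on every visit, which keeps the observed behavior for each group clean.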

Creating a Successful Data Journey Through Enhanced Customer Experience

When I first became an industry analyst, I was fascinated with vendor focus on key capability sets and engineering. The market shifted between acquisition cycles, new startups, and development cycles for a go-to-market strategy based on differentiated tools and capability sets. All of this jockeying for position based on feature sets created strong technology stacks that paved the way for a platform approach and eventual modernization and the transition to cloud.

Qlik and Confluent Together Deliver Real-Time Data Streaming To Kafka

Qlik Data Integration enables you to automatically produce real-time transaction streams into Kafka. Take advantage of modern analytics and microservices, enabling streaming analytics as well as streaming ingestion into data lakes and data warehouse platforms. And unlock the potential of data from legacy systems with microservices environment integrations.

How to maximize your cloud modernization mojo

Cloud migration is a top initiative of organizations all over the world today. But how can businesses make sure that their migration efforts are not only successful, but that they are making the most of data in the cloud? Join this session to hear how Talend, AWS, and Snowflake together can help organizations in their cloud journey to truly modernize their systems and applications — and reap the benefits of data while optimizing data management in the cloud.

Qlik Helps Deliver Greater Value From Databricks Lakehouse

To build a high-performing data lake, you need a solution for handling the labor-intensive manual engineering tasks that have traditionally slowed data delivery to a crawl. In other words, you need automated data integration, transformation, and cataloging. And that’s exactly what Qlik provides, when you use Qlik Data Integration with Databricks.

How to leverage automation with integrated system data

The automation of common business practices is widely considered to represent the future for many industries, with 66% of modern organizations reportedly in the process of automating one or more core business functions, indicating adoption growth of 9% between 2018 and 2020 alone. By automating the more time-consuming and repetitive aspects of a business's daily operations, companies can better allocate time to the most complicated facets of their roles. With the added benefit of intelligent data analysis, well-implemented automation can vastly improve efficiency.

Why the Data Warehouse is Not Dead and Stronger Than Ever

This is a guest post for Integrate.io written by Bill Inmon, an American computer scientist recognized as the "father of the data warehouse." Inmon wrote the first book and first magazine column about data warehousing, held the first conference about this topic, and was the first person to teach data warehousing classes. Five things you need to know about this topic: The data warehouse is the whack-a-mole of technology.

27 Blogging Statistics That Will Shape Your Content Strategy in 2023

Even though some have (falsely) predicted that the rise of social media would mean the end for blogs — blogging is still very alive and kicking: according to Ahrefs, about 22,000 people in the US search Google for “How to start a blog” each month. Still, blogging has changed quite a bit in the last decade and it requires more forethought and planning than it did in the days you could simply throw up a basic website, write whatever you wanted, and still get a decent amount of traffic.

Eneria trailblazes real-time data synchronization with Talend Change Data Capture (CDC)

Eneria, a subsidiary of the Monnoyeur Group, is a leader in energy production and motorization solutions. As the exclusive Caterpillar dealer in France, they have developed notable and unique expertise in the field of generator sets, Caterpillar generators, inverters, and engines. As a well-established industry leader with over 870 employees across dozens of departments, Eneria knows the pivotal importance of interdepartmental communication — not just between employees, but between data.

Is Data Mesh the Right Framework for Your Data Ecosystem?

With the ever-increasing volume of data being generated from a highly diverse set of sources, organizations have increasingly focused on solutions that can help them manage data more efficiently and effectively. Indeed, in the current decade, having a robust data infrastructure is key to an organization’s success, and timely data-driven decision-making is what every management team is striving for today.

Selling In a Downturn Requires Doing More with Data

According to the Chief Economists Outlook published by the World Economic Forum in September 2022, 7 out of 10 economists now consider a global recession at least somewhat likely. The sales organization has a front seat view to the impacts of recession as sales personnel hear their customers cite budget cuts, postpone purchases, and possibly look for less expensive alternatives.

The 7 Top Data Engineering Tools in 2023

What’s the difference between knowledge and wisdom? Knowledge is knowing Excel is Turing-complete. Wisdom is not using Excel as your database. The best data engineering tools make your life easier. Speed up processes, simplify complex operations, give you insights into the machinery, and maybe save some $ along the way. In this article, we’ll give you an overview of the 7 best data tools for data engineering use cases.

Integrating Your Data Warehouse and Data Mesh Strategies

Data warehousing requires data centralization, whereas data mesh enables a decentralized approach to data access. Organizations might think that the solution to their data management strategy requires a choice between the two, but the reality is that both approaches can and should co-exist.

Does the Data Warehouse Sit on a Single Physical Database?

This is a guest post for Integrate.io written by Bill Inmon, an American computer scientist recognized as the "father of the data warehouse." Five things to know about this topic.

The 6 Best FREE Open-Source ETL Tools in 2023

Data integration can be a daunting task, and data engineers usually prefer open-source ETL solutions because of their transparency (you can always inspect the code), flexibility (tinker with the tool), and price performance (no vendor licenses, no maintenance fees). But there are many “gotchas!” with open-source tools you need to consider before picking the best tool for the job.

Business Monitoring with ThoughtSpot

Business monitoring is essential to a company’s success. Whether you’re improving efficiency, saving costs, planning inventory, or tracking goals, you need to define metrics and monitor them regularly to make progress. With ThoughtSpot, business monitoring is an intuitive experience that starts with visualizing your KPIs in real-time so you can take action when there’s movement.

Deploying Your Hugging Face Models to Production at Scale with MLRun

Hugging Face is a popular model repository that provides simplified tools for building, training and deploying ML models. The growing adoption of Hugging Face usage among data professionals, alongside the increasing global need to become more efficient and sustainable when developing and deploying ML models, make Hugging Face an important technology and platform to learn and master.

No-code/low-code cloud data mapping with Talend

Mapping source columns with a data destination is arduous and time-consuming. Data fields can come from many source types and formats. Even though you are the expert on your datasets, you may require the assistance of IT to set up the mapping. However, the more sources, the more handoffs, the higher the possibility of errors. As organizational data has become more dispersed and voluminous across organizations and applications, it's more important than ever to ensure that you understand your data.

The 7 Costly and Complex Challenges of Big Data Analytics

re:Invent 2022 is just around the corner and we couldn’t be more excited to share the latest ChaosSearch innovations and capabilities with our current and future customers in the AWS ecosystem. Enterprise DevOps teams, SREs, and data engineers everywhere are struggling to navigate the growing costs and complexity of big data analytics, particularly when it comes to operational data.

Transforming Business Through The Power Of Data

Srini Nachinarkiniar, Accenture's Global Snowflake Practice Lead, sees every business as a data business. In this interview with Data Cloud Now host Ryan Green, Srini expands on this philosophy and discusses its three core principles: that data provides a strategic core, that an intelligent data foundation must be built, and that the right talent must be found to establish a data culture.

Democratizing marketing data with BigQuery and Looker

Welcome back to the marketing analytics series where we teach practitioners about Google Cloud marketing analytics solutions. In this video, Kelci shows how you can democratize marketing data with BigQuery and Looker. Watch to see how you can use Looker Blocks and Looker Actions to help your company achieve a successful marketing data strategy and share insights across your organization.

The future of data - Tom Edwards

This episode features an interview with Tom Edwards, Chief Digital and Data Officer at Omnicom Health Group, the largest healthcare marketing and communications network in the world. Prior to Omnicom, Tom served as Chief Digital and Innovation Officer at Epsilon. Tom has been named one of the Top 50 Most Influential Business Leaders in Technology and a Top 10 Global Marketer Award winner by OnCon this year. On this episode, Tom talks about transparency in decision making, how to organize massive amounts of data in order to derive insights, and how to determine the ideal communication strategy for a target audience.

Once Upon a Time in the Land of Data

I recently had the privilege of attending the CDAO event in Boston hosted by Corinium. Tracks represented financial services, insurance, retail and consumer packaged goods, and healthcare. Overall, it struck me that while data science is not new, most firms are still defining the mission of the data office and data officer. It’s clear firms seek to leverage data and embrace its potential insights, but most are forging ahead in largely uncharted territory.

Interview With Hikari Senju, Founder and CEO of Omneky

For the next interview in our series speaking to technical leaders from around the world, we’ve welcomed Hikari Senju, Founder and CEO of Omneky. Hikari Senju is the founder and chief executive officer of Omneky, an AI platform for generating, analyzing, and optimizing personalized ads at scale.

Do More With Yellowfin: November 2022 Yellowfin User Group Event Recap

Welcome to our Yellowfin Customer User Group Event recap for November 2022! Following our quarterly webinar held November 8-9, this blog provides a helpful summary for customers who could not attend, covering new updates from the company, including our future product roadmap and extended support of the Yellowfin embedded analytics suite.

How ClearML Helps Daupler Optimize Their MLOps

We recently had a chance to catch up with Heather Grebe, Senior Data Scientist at Daupler, which offers Daupler RMS, a 311 response management system, used by more than 200 cities and service organizations across North America and internationally. This platform helps utilities, public works, and other service organizations coordinate and document response efforts while reducing workload and collecting insights into response operations.

Move Data from Google Sheets to BigQuery

How do you link BigQuery with Google Sheets? Importing a table from Google BigQuery to Google Sheets is quite easy. In fact, you can connect BigQuery tables directly to the Google Sheets spreadsheet with a couple of clicks using Connected Sheets, the new Google BigQuery data connector (check the full instructions here). But what if you want to move data the other way around? That is, how do you send Google Sheets data to your Google BigQuery data warehouse? Now your life becomes a bit more complicated.
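
A real load would go through the BigQuery client library with credentials, but the shaping step is easy to sketch in plain Python. This is an illustrative helper (the function name and sample rows are invented for the example): it turns spreadsheet-style rows, where the first row is the header, into newline-delimited JSON, one of the formats BigQuery load jobs accept.

```python
import json

def sheet_rows_to_ndjson(rows):
    """Convert spreadsheet rows (first row = header) into
    newline-delimited JSON for a BigQuery load job."""
    header, *data = rows
    return "\n".join(json.dumps(dict(zip(header, row))) for row in data)

rows = [["country", "sales"], ["CZ", "120"], ["US", "300"]]
print(sheet_rows_to_ndjson(rows))
```

From there, the resulting file can be handed to a load job (or streamed row by row), with a schema matching the header columns.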

The Ultimate Data Lineage Guide

There is a famous saying that, coincidentally, also holds true for data in modern times. The information we see in pretty reports and charts, or that is displayed to users via an application, has actually experienced a long run of data processing and transformations. These transformations are the result of well-planned ETL pipelines and data management strategies. Originating from different touchpoints, data witnesses several alterations throughout its lifecycle.
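
Lineage is often modeled as a dependency graph over datasets. A minimal sketch (the dataset names are invented for the example) shows how recording each table's direct upstream sources makes it possible to trace any report back to its origins:

```python
# Each dataset records its direct upstream sources.
lineage = {
    "revenue_dashboard": ["revenue_by_region"],
    "revenue_by_region": ["orders_clean", "regions"],
    "orders_clean": ["orders_raw"],
}

def upstream(dataset, graph):
    """Walk the lineage graph to find every source feeding a dataset."""
    sources = set()
    stack = [dataset]
    while stack:
        for parent in graph.get(stack.pop(), []):
            if parent not in sources:
                sources.add(parent)
                stack.append(parent)
    return sources

print(sorted(upstream("revenue_dashboard", lineage)))
# ['orders_clean', 'orders_raw', 'regions', 'revenue_by_region']
```

The same traversal run in the other direction gives impact analysis: which reports break if a source table changes.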

New Pentaho Enterprise Edition 9.4 Delivers on Seamless Hybrid Cloud Data Management

When Pentaho Enterprise Edition (EE) 9.3 was released in May, it represented a significant step in the journey towards a seamless hybrid cloud solution – one that simplified decisions and enabled customers to manage all their data operations for various use case scenarios.

Spend Less Time on Report Creation and More Time on Analysis

You know the old saying “work smarter, not harder”? Turns out that’s easier said than done. How can your finance team transform the way it works and add strategic value to your organization? How can you shift your focus from menial tasks to tactical execution, and ultimately from tactical to strategic activities? In most companies, financial reporting consumes an inordinate amount of time and energy.

An analytic engineering approach to self-service analytics: dbt + ThoughtSpot

In 1987, economist Robert Solow declared, “You can see the computer age everywhere but in the productivity statistics.” He noted that despite massive investments in computer hardware and software, companies saw a decrease in fundamental productivity measures.

The Dataiku Mission To Democratize The Use Of AI-Driven Data

How well are companies doing in providing their employees with the tools and training needed to deliver value from AI? In this episode of “Data Cloud Now,” Shaun McGirr of Dataiku addresses that question and shares the results of a recent survey his company conducted that suggests there’s much more work to be done to deliver on the aspirational goal of making AI usable and accessible across organizations. He also discusses the excitement that Snowpark is generating among those at the leading edge of AI.

SaaS in 60 - Qlik Cloud and HIPAA Compliancy

Qlik is now equipped to help customers meet their HIPAA regulatory requirements. US healthcare organizations can now take full advantage of Qlik Cloud to enhance patient outcomes, improve service delivery, and close the gaps between data insights and actions. Qlik has completed the SOC 2 Type 2 + HITRUST Attestation and has recently launched Customer Managed Keys, an additional security offering that allows customers to retain control of their data’s encryption when stored at rest in Qlik Cloud.

Episode 7 | Data Lifecycle | 7 Challenges of Big Data Analytics

What is a data lifecycle? From birth to death, from source to destination, data seems to always be on a journey. If storage and compute were free, or there were no laws like the “Right to be Forgotten” within policies such as the “General Data Protection Regulation” (GDPR for short), organizations might never delete information. However, at scale, data gets extremely expensive, and customers do have liberties with regard to governance and sovereignty. Often, platforms have entire sets of controls and procedures around the lifecycle of data. In this episode, we will focus on the complexity of scale when it comes to a day in the life of data.
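
The retention side of a lifecycle policy can be sketched as a simple filter. This is a toy illustration (the field names, the one-year window, and the erasure set are all assumptions for the example), combining an age cutoff with right-to-be-forgotten erasure requests:

```python
from datetime import date, timedelta

def apply_retention(records, today, max_age_days=365, erasure_requests=frozenset()):
    """Keep only records inside the retention window whose owners
    have not requested erasure."""
    cutoff = today - timedelta(days=max_age_days)
    return [r for r in records
            if r["created"] >= cutoff and r["user_id"] not in erasure_requests]

records = [
    {"user_id": "u1", "created": date(2021, 1, 5)},   # older than one year
    {"user_id": "u2", "created": date(2022, 10, 1)},
    {"user_id": "u3", "created": date(2022, 11, 1)},  # erasure requested
]
kept = apply_retention(records, today=date(2022, 11, 15), erasure_requests={"u3"})
print([r["user_id"] for r in kept])  # ['u2']
```

At platform scale the same two rules (age out, erase on request) are applied by automated jobs rather than in application code, but the logic is the same.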

BigQuery object tables in a minute

Are you working with separate systems to analyze structured and unstructured data? Introducing BigQuery object tables, a new type of table in BigQuery that provides a structured record interface for unstructured data in Google Cloud Storage. Watch to see how object tables extend Google data cloud’s best practices of securing, sharing, and governing structured data to unstructured, without needing to learn or deploy new tools.

A Look Inside The Snowflake/Microsoft Partnership

How do two companies manage a relationship that is both competitive and supportive? Snowflake and Microsoft fall into that category. Take a peek behind the curtain of this unique partnership as “Data Cloud Now” host Ryan Green sits down with John Sapone of Snowflake and Tyler Bryson of Microsoft for an enlightening discussion.

Snowflake's Commitment to Continuously Improve Economics for Our Customers

Since Snowflake’s inception, we’ve had the needs of our customers as our North Star, with a clear focus on security and governance of data. Early on we also committed to continuous innovations to improve performance and reduce latencies, and by virtue of our business model continuously improve the economics for our customers.

How data agility propels the world's online marketplace eBay

In today’s economy, every business is terrified of failure. They need to beat the competition to win. The most impactful way to do that is by using data as a strategic asset. Learn how eBay, the world’s leading online marketplace, optimized their data services to meet critical business objectives. Dive into how they experienced zero downtime while migrating their solution and built a framework that now enables the organization with a plug-and-play model that moves at the speed of business.

Achieve Insightful Operational Reporting for Oracle ERPs

Your business needs actionable insights from your Oracle ERP data to respond to volatile market conditions and outpace your competition. But generating custom reports requires deep technical knowledge and the process is often managed by IT. The process can often take weeks, if not months, and, in many cases, the report or dashboard is limited to a single use case and applicable only to a single business unit or user – often only the requester.

Episode 6 | Data Analytics | 7 Challenges of Big Data Analytics

The first 5 challenges of #bigdataanalytics have been solved, bringing us closer to the end of the #datajourney. And here is where it starts getting real: Data Analytics. Today, there are struggles between operational and business analysis departments over getting SQL and ML functionality natively, without data movement or duplication. How can you access and share data in a timely and efficient way, without data movement, duplication, or an insane cost increase? Thomas Hazel shares his insights on how any organization can overcome this challenge, easily.

How to customize color ranges in charts

Learn how to change the color settings in a chart by using different metric ranges, and even custom sets of colors. You'll learn how to manipulate the color in your charts by setting upper and lower bounds, or by specifying the midpoint in your chart. You will also learn the two places where you can access the color settings in your charts.
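
The bounds idea generalizes to any charting tool. As a rough illustration (the palette and function name are invented, not tied to any specific product), values are clamped at the bounds and then bucketed evenly across the palette:

```python
def color_for(value, lower, upper, palette=("#2166ac", "#f7f7f7", "#b2182b")):
    """Map a metric value to a palette color between lower and upper bounds."""
    if value <= lower:          # clamp below the range
        return palette[0]
    if value >= upper:          # clamp above the range
        return palette[-1]
    band = (upper - lower) / len(palette)   # equal-width bands
    return palette[int((value - lower) // band)]

print(color_for(50, 0, 100))  # midpoint of the range -> middle color
```

Moving the midpoint amounts to making the bands unequal in width, which is effectively what the midpoint setting in a chart does.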

Our reflections on the 2022 Gartner Magic Quadrant for Data Quality Solutions

In its 2022 Magic Quadrant™ for Data Quality Solutions report, Gartner® emphasizes the “importance of having a critical approach toward managing the health and fitness of data.” We agree that “poor data quality completely destroys business value.” Organizations simply cannot afford for data quality to be an afterthought. And if you’re currently in the process of evaluating data solutions for your organization, this report can help.

Hevo vs Airbyte vs Integrate.io: An ETL Tool Comparison

In the competitive market of ETL solutions, platforms like Hevo, Airbyte, and Integrate.io are amongst the top contenders. While they all are ETL/ELT/Reverse ETL platforms, each has its unique set of features to offer. The best ETL tool for your business is the one that best fits in your modern data stack and is aligned with your unique requirements. So how do you decide which tool meets your business needs?

Create Beautiful Business Insights With Yellowfin Using Data from APILayer

Yellowfin analytics has a broad range of capabilities to help enterprise organizations and product owners solve their most pressing analytical dashboard and reporting needs. If you've been using Yellowfin for a while, you know how great it is to tell stories with data, work together, and make beautiful, easy-to-use dashboards that let more people see, understand, and act on their data.

Is self-service BI attainable? Benefits and historical concerns of self-service BI

Whether you call it self-service analytics or self-service business intelligence (BI), there has been much discussion about the perils, myths, promises, and prospects of successfully building self-service capability. Going forward, I’ll use the phrase “self-service BI,” but you are welcome to substitute the words “self-service analytics.” So, is self-service BI actually attainable, or just snake oil?

Ozone Write Pipeline V2 with Ratis Streaming

Cloudera has been working on Apache Ozone, an open-source project to develop a highly scalable, highly available, strongly consistent distributed object store. Ozone is able to scale to billions of objects and hundreds of petabytes of data. It enables cloud-native applications to store and process mass amounts of data in a hybrid multi-cloud environment and on premises.

How Doordash Brings Data Analytics to Everyone With Snowflake And Alteryx

In this interview with Ryan Green on “Data Cloud Now,” Adam Wilson and Nitin Brahmankar of Alteryx outline their company’s mission to move data from the realm of the highly technical and into the hands of the people in organizations who actually use the data to unlock insights, manage risk, cultivate markets, and more. It’s a mission, they say, that is helped along by the ecosystem of Snowflake users who work as a community to solve customer use cases.

SaaS in 60 - Qlik Sense SaaS - Monitoring Apps

Did you know that Qlik offers a variety of monitoring applications that can provide various insights on your Qlik Cloud environment? If you want to track usage capacity of users on your tenant, check out the Entitlement Analyzer. Need to optimize your Qlik Sense applications? Then perhaps the App Analyzer will help. Want more insight into your app reloads? Download the Reload Analyzer.

Episode 5 | Data Platform | Data Journey | 7 Challenges of Big Data Analytics

What are data platforms? A data platform (or, more topically, a “cloud data platform”) is an integrated set of technologies that collectively meet an organization’s end-to-end data needs. In totality, it enables the storage, delivery, and governance of company data, as well as a security layer for users and applications. The heart of a platform is an actual database, which is why it might be better called a data “analytics” platform, or in our case, a big data analytics platform. Learn more about data platforms and how the ChaosSearch platform solves the challenges faced in big data analytics.

What is a data catalog?

Metadata is data about data. Think of names, creation dates, and any other contextual information that describes the data in your data lake or data warehouse. All this metadata adds meaningful information to your datasets. This improves the data’s usability and makes data a real asset for your organization. A catalog of all the metadata makes search and retrieval of any data possible.
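
A catalog can be pictured as a searchable index of metadata entries. A minimal sketch, with made-up dataset names and fields:

```python
# Each catalog entry holds metadata about one dataset, not the data itself.
catalog = [
    {"name": "orders_raw", "owner": "ingestion", "created": "2022-01-10",
     "tags": ["sales", "raw"]},
    {"name": "orders_clean", "owner": "analytics", "created": "2022-03-02",
     "tags": ["sales", "curated"]},
    {"name": "web_logs", "owner": "platform", "created": "2022-06-21",
     "tags": ["clickstream", "raw"]},
]

def search(term, entries):
    """Return every dataset whose name or tags mention the term."""
    term = term.lower()
    return [e["name"] for e in entries
            if term in e["name"].lower() or term in (t.lower() for t in e["tags"])]

print(search("sales", catalog))  # ['orders_raw', 'orders_clean']
```

Production catalogs add lineage, schemas, and access policies on top, but the core idea is the same: a queryable layer of metadata over the lake or warehouse.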

Jet Reports for Microsoft Dynamics 365 Finance and Supply Chain Management Overview

Jet Reports, from insightsoftware, is a fast, flexible financial and business reporting solution inside Excel that allows finance and business users to easily get all of the data they need in the format they want. Jet Reports' easy-to-configure reports and pre-built templates eliminate the need for manual data dumps and IT dependency, allowing fast and easy access to data from your Microsoft Dynamics database in a format that works for you.

Snowpark for Python: Large-Scale Feature Engineering, Machine Learning Model Training, and More

As data science and machine learning adoption has grown over the last few years, Python is catching up to SQL in popularity within the world of data processing. SQL and Python are both powerful on their own, but their value in modern analytics is highest when they work together.

Developers Rejoice! Snowflake Is All in on Python, Pipelines, and Apps

Snowflake is committed to helping developers focus on building their apps and businesses rather than on infrastructure management. At this year’s Snowday, Snowflake announced a series of advancements that empower developers to do more with their data, enhancing productivity and unlocking new ways to develop applications, pipelines, and machine learning (ML) models with Snowflake’s unified data platform.

Driving Business Value from a Data Mesh Approach

Irrespective of what it’s called, the market has talked about what amounts to data mesh for several years. The concept of decentralized data management driven by business domains helps support the need for business-focused data outcomes. It also helps place value where the value of data projects should be: on business needs. Data-driven organizations need to look at business domains as a way of organizing the various desired outcomes of analytics and data movement initiatives.

Drive Valuable Insights About Your Web3 Application Using API Analytics

What’s your API data telling you about your Web3 App? By lifting relevant information from your App’s API transactions and call logs, you can identify and proactively catch issues before they’re surfaced by your customers. Keep your customers happy and reduce churn: make your customer success team performant.

Using Snowpark For Python And XGBoost To Run 200 Forecasts In 10 Minutes

Snowpark for Python, now generally available, empowers the growing Python community of data scientists, data engineers, and developers to build secure and scalable data pipelines and machine learning (ML) workflows directly within Snowflake—taking advantage of Snowflake’s performance, elasticity, and security benefits, which are critical for production workloads. Using user-defined table functions (UDTFs) and the new Snowpark-optimized warehouse with higher memory, users can run large-scale model training workloads using popular open-source libraries available through Anaconda integration.

Winning the race: data as the ultimate competitive edge, with Susie Wolff

Susie Wolff, former Formula 1 driver and founder of Dare to be Different, knows a lot about using data to thrive under pressure. In racing, data is the difference between being a champion and falling behind. How can your business become data driven the way Formula 1 has? How can you get the insights you need to thrive — not tomorrow, not next week, but right now? Industry analyst and digital transformation expert Maribel Lopez interviews Wolff, extracting takeaways that every business can apply.

Modern Data Architectures | Data Mesh, Data Fabric, & Data Lakehouse

For years, companies have viewed data the wrong way. They see it as the byproduct of a business interaction, and this data often ends up collecting dust in centralized silos governed by data teams who lack the expertise to understand its true value. Cloudera is ushering in a new era of data architecture by allowing experts to organize and manage their own data at the source. Data mesh brings all your domains together so each team can benefit from each other’s data.

Become a Financial Storyteller

Financial statements tell an important story, but they rarely tell the entire story. It often requires a sharp eye and a healthy measure of experience to elicit meaningful information from the numbers. Even people with keen financial acumen will have questions, and they can easily overlook important realities that lay buried somewhere within the details. For those with less experience reading financial reports, this task is far more difficult.

What is Data Security? - The Role of Analytics in Data Protection

Data security (or data protection) is a term often used in the context of analytics and business intelligence (BI). It encompasses a number of different policies, processes, and technologies that protect a company's cyber assets against data breaches and threats. But what does all of that really mean, in relation to BI specifically?

Are These the 6 Best Reverse ETL Vendors?

The amount of big data that enterprises churn out is simply staggering. All this information is worthless unless organizations unlock its true value for analytics. This is where ETL proves useful. Traditional ETL (extract, transform, and load) remains the most popular method for moving data from point A to point Z. It takes disparate data sets from multiple sources, transforms that data into the correct format, and loads it into a final destination like a data warehouse.
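The three ETL steps described above can be sketched in a few lines. This is a toy illustration, not any vendor's implementation: the source rows, field names, and in-memory "warehouse" are all invented stand-ins for real systems.

```python
# Toy ETL sketch: extract rows from two hypothetical sources with
# different schemas, transform them into one common format, and load
# the result into a list standing in for a warehouse table.

def extract():
    # Two disparate sources with mismatched field names and types.
    crm_rows = [{"Name": "Ada", "Spend": "120.50"}]
    billing_rows = [{"customer": "Bob", "total": 80.0}]
    return crm_rows, billing_rows


def transform(crm_rows, billing_rows):
    # Normalize both sources to a shared schema: customer (str), spend (float).
    unified = []
    for r in crm_rows:
        unified.append({"customer": r["Name"], "spend": float(r["Spend"])})
    for r in billing_rows:
        unified.append({"customer": r["customer"], "spend": float(r["total"])})
    return unified


def load(rows, warehouse):
    # In a real pipeline this would be an INSERT or COPY into the warehouse.
    warehouse.extend(rows)


warehouse = []
load(transform(*extract()), warehouse)
```

Reverse ETL runs the same movement in the opposite direction, pulling rows back out of the warehouse and into operational tools.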

Bridging The Gap Between Legacy Security And Modern Threat Detection

In this episode of “Powered by Snowflake” host Daniel Myers sits down with Anvilogic’s Security Strategist and Head of Product Marketing, Jade Catalano. Founded by a team of security industry vets, Anvilogic has helped ease legacy security systems into the future, enabling organizations to quickly detect, hunt, and respond to threats. This conversation covers the challenges of modernizing legacy security systems, how to manage an excess of data, a product demo, and more.

Episode 3 & 4 | Data Destination & Data Governance | Data Journey

What are data destinations? In a very abstract sense, a data destination is just another element in the series of processing steps that make up a data pipeline. However, when an element is called out as the destination, it is usually the final destination, such as a database, data lake, or data warehouse. And yet, any element within the data pipeline has aspects of a final destination (and the scaling challenges that come with it).
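The point that every pipeline element is both a destination and a source can be made concrete with a small sketch. The stage names and data here are invented for illustration; the shape is what matters: each function receives the previous stage's output and hands its own output onward, with only the last stage treated as "final."

```python
# Each stage is a destination for the stage before it and a source
# for the stage after. Only warehouse_load is the "final" destination.

def ingest(events):
    # Destination for raw events; source for enrichment.
    return [e.strip() for e in events]


def enrich(records):
    # Destination for ingested records; source for the warehouse load.
    return [{"value": r, "length": len(r)} for r in records]


def warehouse_load(rows):
    # The final destination, standing in for a database or data warehouse.
    return {"table": rows, "row_count": len(rows)}


result = warehouse_load(enrich(ingest([" click ", " view "])))
```

Seen this way, governance and scaling concerns attach to every hand-off in the chain, not just to the last one.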

Go Beyond Data Visualization to Data Storytelling

Stories are a fundamental component of effective human communication. According to a study conducted by Stanford University professor Chip Heath, 63% of people are likely to remember a story shared as part of a presentation. He also found that speakers who merely present facts and figures only achieve a 5% recall rate among their audience. Stories convey meaning and context in ways that facts and figures alone cannot.

How Elevate.inc Used Data Integration to Improve Customer Experience

Customer experience is one of the most critical concerns for any organization—but also one of the areas where it is hardest for companies to make concrete improvements. When they better understand the customer experience, businesses can define a clear, actionable roadmap to optimize the customer journey. In turn, this will pay dividends in terms of greater employee productivity, lower costs, and higher profits.

Qlik Cloud Data Integration - World Class Data Movement and Transformation to Power the Enterprise Data Mesh

Today we announced the launch of Qlik Cloud Data Integration, our new Enterprise Integration Platform as a Service (eiPaaS) offering that is a significant expansion of our Qlik Cloud portfolio. It’s been quite a journey over the last few years that’s led to this moment.

BigQuery, Google's Enterprise Data Warehouse

The Google Cloud Computing Foundations courses are for individuals with little to no background or experience in cloud computing. They provide an overview of concepts central to cloud basics, big data, and machine learning, and where and how Google Cloud fits in. By the end of the series of courses, learners will be able to articulate these concepts and demonstrate some hands-on skills. Google Cloud Skills Boost provides real Google Cloud environments that help developers and IT professionals learn cloud platforms and software, such as Firebase, Kubernetes, and more.

Kensu partners with Collibra to automate data catalog completion

Kensu announces its partnership with Collibra, the Data Intelligence company, and the availability of an integration between the two solutions. Kensu's observability capabilities will enrich Collibra's Catalog with clean, trustworthy, and curated information to enable business users and data scientists to make business decisions based on reliable data.

Partnering with AWS on Amazon HealthLake to Speed Insights

Gaps in patient healthcare, ranging from access and affordability, to those specific to race, gender, age and beyond, are widening across the US and leading to a variety of detrimental results for people, the healthcare system, and the economy itself. Such ongoing disparities are slowing the country’s ability to achieve population health and accounting for billions of dollars in unnecessary health care spending annually.

When Private Cloud is the Right Fit for Public Sector Missions

It’s no secret that IT modernization is a top priority for the US federal government. A quick trip in the congressional time machine to revisit 2017’s Modernizing Government Technology Act surfaces some of the most salient points regarding agencies’ challenges: In the private sector, excluding highly regulated industries like financial services, the migration to the public cloud was the answer to most IT modernization woes, especially those around data, analytics, and storage.

Growing With The Data Cloud - Lightfold's Story Of Startup & Success Through The Pandemic

Lightfold, a young analytics consultancy based in Brisbane, Australia, came into being in 2019, just before COVID hit, with the goal of helping companies gain mastery over their data by building modern data stacks centered around Snowflake. In this episode of “Data Cloud Now,” Ryan Green chats with three members of the Lightfold leadership team–CEO John Cosgrove, Data Evangelist Graeme Lewis, and Director, Design and Product, Kylie Willett–about their company’s origin story and about the way in which Snowflake’s technology has fundamentally changed how organizations think about data.

Webinar Recording: Accelerating Cloud Data Modernization

As organizations seek to become more competitive, they are often looking to enrich their data sets for analytics to gain deeper insights. The data used for enrichment may include text data, machine data, image data, geospatial data, and real-time data. This data may be high volume, highly diverse, and disparate in nature. As part of this effort, organizations are moving to cloud data platforms to store and manage this modern data.

Webinar Recording: Powering Modern Applications: Data Management for Speed and Scale

Designed to be fast, scalable, flexible, and user-friendly, modern applications are at the center of the innovation and automation that is transforming companies, industries, and society today. At the same time, modern applications, increasingly built with microservices, also come with requirements that traditional data management approaches fall far short of effectively meeting.