
Commerzbank | Unleashing Hidden Data Treasures for Customers

Like many financial institutions, Commerzbank faced the challenge of staying flexible to meet customer needs while maintaining regulatory compliance. In this Movers & Makers, Justyna Lebedyk, Product Owner in Big Data for Commerzbank, talks about how the bank's digital transformation with the hybrid cloud and Cloudera allowed it to overcome this challenge.

Building and Managing the Modern Datastore: The Data Lakehouse

The 'data lakehouse' is quickly becoming popular in the data analytics community. Data lakehouse architecture combines the benefits of a data warehouse and a data lake: it aims to merge the data warehouse's data structure and management features with the flexibility and relatively low cost of the data lake. Watch this panel discussion to learn how the data lakehouse can address the limitations of both data lake and data warehouse architectures to deliver significant value for organizations. Explore why the data lakehouse is an ideal option for enterprise data storage initiatives.

MongoDB vs. PostgreSQL: Detailed Comparison of Database Structures

A secure database is one of the most important parts of any company's operations. With phishing attacks, malware, and other threats on the rise, it is essential to make the right choice in order to keep your data safe and process it effectively. However, it can be difficult to choose among the wide variety of database solutions on the market today. Two commonly used options are MongoDB and PostgreSQL. What do you need to know about MongoDB vs. PostgreSQL?
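The core structural difference the article compares is relational tables with a fixed schema (PostgreSQL) versus schema-flexible documents (MongoDB). As a rough, stdlib-only sketch of that contrast, the toy below uses SQLite to stand in for the relational side and plain Python dicts for the document side; it does not use the actual PostgreSQL or MongoDB client APIs, and all names are illustrative.

```python
import sqlite3

# Relational style (PostgreSQL-like): a schema is declared up front and
# every row must conform to it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT NOT NULL, email TEXT)")
conn.execute("INSERT INTO users (name, email) VALUES (?, ?)", ("Ada", "ada@example.com"))
row = conn.execute("SELECT name, email FROM users WHERE name = ?", ("Ada",)).fetchone()

# Document style (MongoDB-like): each record is a self-describing document,
# so fields can vary per record without a schema migration.
docs = [
    {"name": "Ada", "email": "ada@example.com"},
    {"name": "Grace", "roles": ["admin"]},  # extra field, no ALTER TABLE needed
]
match = next(d for d in docs if d["name"] == "Ada")

print(row)    # ('Ada', 'ada@example.com')
print(match)  # {'name': 'Ada', 'email': 'ada@example.com'}
```

The trade-off the article explores follows from this: the fixed schema buys integrity guarantees and rich joins, while the document model buys flexibility for heterogeneous or rapidly evolving data.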

Unstructured Data Now Generally Available in Snowflake, Processing with Snowpark in Public Preview

We’re excited to announce the general availability of unstructured data management functionality in Snowflake. We launched the public preview of this functionality in September 2021, and since then we have seen adoption by customers across industries for a variety of use cases. These include storing and securing call center recordings, securely sharing PDF documents on Snowflake Data Marketplace, storing medical images and extracting data from them, and many more.

Why You Need a Fully Automated Data Pipeline

When you think about the core technologies that give companies a competitive edge, a fully automated data pipeline may not be the first thing that leaps to mind. But to unlock the full power of your data universe and turn it into business intelligence and real-time insights, you need to gain full control and visibility over your data at all of its sources and destinations. This article outlines the five main reasons to implement a fully automated data pipeline.

Business Intelligence on the Cloud Data Platform: Approaches to Schemas

The cloud data platform combines data warehouse and data lake capabilities to support the exploding world of analytics. Like a data warehouse, the cloud data platform structures, transforms, and queries data. Like a data lake, it classifies multi-structured data objects in an elastic object store. The cloud data platform provides an ideal launchpad for modern business intelligence (BI) projects that need fast, flexible access to lots of varied data. As you might expect, this is a tall order to fill.

Automatic data risk management for BigQuery using DLP

Protecting sensitive data and preventing unintended data exposure is critical for businesses. However, many organizations lack the tools to stay on top of where sensitive data resides across their enterprise. It’s particularly concerning when sensitive data shows up in unexpected places – for example, in logs that services generate, when customers inadvertently send it in a customer support chat, or when managing unstructured analytical workloads.
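Automatic risk management of the kind described here rests on an inspection step: scanning text for patterns that look like sensitive data. The real product uses Cloud DLP's managed infoType detectors over BigQuery tables; the sketch below is only a toy, regex-based stand-in for that inspection step, and the detector names and `inspect` function are hypothetical illustrations, not the Cloud DLP API.

```python
import re

# Toy stand-in for a DLP-style inspection pass. Real Cloud DLP uses
# managed infoType detectors; these regexes are illustrative only.
DETECTORS = {
    "EMAIL_ADDRESS": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "US_SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def inspect(text: str) -> list[tuple[str, str]]:
    """Return (info_type, matched_text) findings for each detector hit."""
    findings = []
    for info_type, pattern in DETECTORS.items():
        for m in pattern.finditer(text):
            findings.append((info_type, m.group()))
    return findings

# Sensitive data turning up in an unexpected place, e.g. a support chat log:
log_line = "support chat: my ssn is 123-45-6789, reach me at jo@example.com"
print(inspect(log_line))
```

In a production setup, findings like these would feed risk scoring and tagging of the tables they came from, rather than being printed.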