Six Trends Driving Adoption of Lumada DataOps Suite

Innovative organizations need DataOps and new technologies because old-school data integration is no longer sufficient. The traditional approach creates monolithic, set-in-concrete data pipelines that can’t convert data into insights quickly enough to keep pace with business. The following trends are driving the adoption of Hitachi’s Lumada DataOps Suite.

How is logistics analytics driving business outcomes and growth?

Transportation and logistics companies generate and consume more data than almost any other industry. Despite this, they still find themselves lagging behind other B2B verticals in their ability to turn a profit from data. With thinning profit margins and new contenders entering the logistics industry, the only way to outperform other companies is through brains, not brawn. Logistics analytics offers the edge over the competition.

CDP Public Cloud: SSH Key Deployment

This video covers how to deploy SSH keys in CDP Public Cloud. It shows how to generate a new SSH key pair and steps through deploying it for a workload user, both through the Cloudera Management Console web UI and with the CDP command-line tool. It discusses the security implications of logging in to Data Hub hosts as the Cloudbreak user, explains why workload user credentials should be used instead in most cases, and demonstrates using the deployed SSH keys to log in to Data Hub hosts.
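The key-generation and login steps described above can be sketched from the command line. The exact Management Console and CDP CLI steps for uploading the public key are shown in the video; the commands below cover only the standard OpenSSH parts, and the host and user names are placeholders:

```shell
# Generate a new ed25519 SSH key pair (no passphrase here for brevity;
# use one in practice). The key files are written to the current directory.
ssh-keygen -t ed25519 -f ./cdp-workload -N "" -C "cdp-workload-user"

# The public half (cdp-workload.pub) is what gets deployed for the
# workload user via the Management Console or the CDP CLI.
cat ./cdp-workload.pub

# Once deployed, log in to a Data Hub host as the workload user
# (not as the Cloudbreak user):
#   ssh -i ./cdp-workload <workload-user>@<datahub-host>
```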

Policy-Driven Data Obfuscation: What, Why and How

How vulnerable is your sensitive data? Your data policies may put this information at risk of being breached. An ad hoc approach for dealing with this data makes it difficult to maintain your organization’s cybersecurity. Data obfuscation holds the key to improving your security and making it easier to use your data, but it must be driven by your policies to be effective.
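To make the contrast with an ad hoc approach concrete, here is a minimal Python sketch of policy-driven obfuscation: the decision of which fields to mask, and how, lives in one declarative policy rather than being scattered through application code. The field names and masking rules are illustrative, not taken from any specific product:

```python
import hashlib

# Illustrative policy: field name -> masking rule.
POLICY = {
    "email": "hash",      # irreversible pseudonym, still usable for joins
    "ssn": "redact",      # fully removed from output
    "name": "partial",    # keep first character only
}

def obfuscate(record: dict) -> dict:
    """Apply the policy to one record; unlisted fields pass through."""
    out = {}
    for field, value in record.items():
        rule = POLICY.get(field)
        if rule == "hash":
            out[field] = hashlib.sha256(value.encode()).hexdigest()[:12]
        elif rule == "redact":
            out[field] = "***"
        elif rule == "partial":
            out[field] = value[0] + "*" * (len(value) - 1)
        else:
            out[field] = value  # not covered by policy
    return out

print(obfuscate({"name": "Alice", "email": "alice@example.com", "ssn": "123-45-6789"}))
```

Because the policy is data rather than code, changing what counts as sensitive means editing one structure, which is the property that makes the approach auditable.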

What is No-Code?

Are you asking yourself the question “what is no-code”? You’re not alone. The concept sounds almost too good to be true: developing your own software applications without ever having to learn a programming language like Java or Python. Even your most technophobic employee can become a star software developer thanks to the proliferation of no-code development tools.

How to configure clients to connect to Apache Kafka Clusters securely - Part 4: TLS Client Authentication

In the previous posts in this series, we discussed Kerberos, LDAP, and PAM authentication for Kafka. In this post we will look at how to configure a Kafka cluster and client to use TLS client authentication. The examples shown here highlight the authentication-related properties in bold to differentiate them from other required security properties, as in the example below. TLS is assumed to be already enabled for the Apache Kafka cluster, as it should be for every secure cluster.
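A representative client configuration for TLS client authentication looks like the sketch below (bold highlighting is not reproducible in a plain-text fence, so the authentication-related properties are marked with comments instead; all paths and passwords are placeholders):

```properties
security.protocol=SSL
ssl.truststore.location=/path/to/truststore.jks
ssl.truststore.password=trustore-password
# Authentication-related properties: the keystore holds the client's
# certificate and private key, which the broker verifies when
# ssl.client.auth=required is set on the broker side.
ssl.keystore.location=/path/to/client.keystore.jks
ssl.keystore.password=keystore-password
ssl.key.password=key-password
```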

What Are the Best Integrators for Heroku?

If you're a developer trying to ETL data into and out of Heroku, the seemingly short list of options may disappoint you. Heroku itself promotes Heroku Connect, but this expensive solution might not even integrate with all the systems you use (like AdWords and Facebook), making it difficult to get a holistic view of your data. Fortunately, Heroku Connect isn't the only solution. In fact, there are several third-party ETL tools that can help you get your data in and out of Heroku with ease.