How to Develop a Data Processing Job Using Apache Beam - Streaming Pipelines

In our last blog, we talked about developing data processing jobs using Apache Beam. This time we are going to talk about one of the most in-demand topics in the Big Data world today: processing streaming data. The principal difference between batch and streaming is the type of input data source. When your data set is bounded (even if it is huge in size) and is not updated while it is being processed, you would likely use a batch pipeline. When the data is unbounded and arrives continuously, you need a streaming pipeline.
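
For illustration, here is a minimal sketch of what such a streaming pipeline can look like in the Beam Java SDK. The Kafka broker address, the topic name "events", the one-minute window size, and the output path are all hypothetical choices for the example, not values from the original post:

```java
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.options.StreamingOptions;
import org.apache.beam.sdk.transforms.Count;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.Values;
import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.joda.time.Duration;

public class StreamingCount {
  public static void main(String[] args) {
    StreamingOptions options =
        PipelineOptionsFactory.fromArgs(args).as(StreamingOptions.class);
    options.setStreaming(true); // the source below is unbounded

    Pipeline p = Pipeline.create(options);

    p.apply("ReadFromKafka",
            KafkaIO.<String, String>read()
                .withBootstrapServers("localhost:9092") // hypothetical broker
                .withTopic("events")                    // hypothetical topic
                .withKeyDeserializer(StringDeserializer.class)
                .withValueDeserializer(StringDeserializer.class)
                .withoutMetadata())
        .apply(Values.<String>create()) // keep only the message payloads
        // An unbounded source never "finishes", so group elements into
        // fixed one-minute windows before aggregating.
        .apply(Window.<String>into(FixedWindows.of(Duration.standardMinutes(1))))
        .apply(Count.perElement())
        .apply(MapElements.into(TypeDescriptors.strings())
            .via((KV<String, Long> kv) -> kv.getKey() + ": " + kv.getValue()))
        .apply(TextIO.write().to("counts").withWindowedWrites().withNumShards(1));

    p.run().waitUntilFinish();
  }
}
```

Note the windowing step: it is what lets an aggregation like Count produce periodic results over a stream that never ends, which is the key structural difference from the batch pipelines covered in the previous post.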

Talend and Splunk: Aggregate, Analyze and Get Answers from Your Data Integration Jobs

Log management solutions play a crucial role in an enterprise's layered security framework: without them, firms have little visibility into the actions and events occurring inside their infrastructures that could either lead to data breaches or signify a security compromise in progress. Splunk, often called the "Google for log files," is a heavyweight enterprise tool that was among the first log analysis products and has been a market leader ever since.
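
To give a flavor of how job telemetry can reach Splunk, here is a minimal sketch (not Talend's built-in mechanism) that posts a job event to Splunk's HTTP Event Collector. The host, token, and event fields are placeholders:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

/** Minimal sketch: push one data integration job event to Splunk HEC. */
public class SplunkLogForwarder {
  public static void main(String[] args) throws Exception {
    // Hypothetical host and token; HEC listens on port 8088 by default.
    URL url = new URL("https://splunk.example.com:8088/services/collector/event");
    String token = "00000000-0000-0000-0000-000000000000";
    String payload =
        "{\"event\": {\"job\": \"my_talend_job\", \"status\": \"success\", \"rows\": 1024}}";

    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("POST");
    conn.setRequestProperty("Authorization", "Splunk " + token); // HEC auth scheme
    conn.setRequestProperty("Content-Type", "application/json");
    conn.setDoOutput(true);
    try (OutputStream os = conn.getOutputStream()) {
      os.write(payload.getBytes(StandardCharsets.UTF_8));
    }
    System.out.println("HEC response code: " + conn.getResponseCode());
  }
}
```

Once events like this land in an index, Splunk's search language can aggregate them across all jobs, which is the "get answers" part of the post's title.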

Talend & Apache Spark: Debugging & Logging

So far, our journey using Apache Spark with Talend has been a fun and exciting one. The first three posts in my series provided an overview of how Talend works with Apache Spark, some similarities between Talend and Spark Submit, the configuration options available for Spark jobs in Talend, and how to tune Spark jobs for performance. If you haven't already read them, you should do so before getting started here.

[Step-by-step] Using Talend for cloud-to-cloud deployments and faster analytics in Snowflake

For the past two years, Snowflake and Talend have joined forces to develop deep integration capabilities and high-performance connectors so that companies can easily move legacy on-premises data to a built-for-the-cloud data warehouse. Snowflake, which runs on Amazon Web Services (AWS), is a modern data-warehouse-as-a-service built from the ground up for the cloud, for all of an enterprise's data and all of its users.
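
To make the target concrete, here is a minimal sketch of connecting to Snowflake from plain Java over JDBC, outside of Talend. The account identifier, warehouse, database, and credentials are placeholders, and the snowflake-jdbc driver is assumed to be on the classpath:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

public class SnowflakeQuery {
  public static void main(String[] args) throws Exception {
    Properties props = new Properties();
    props.put("user", "TALEND_USER");                     // hypothetical user
    props.put("password", System.getenv("SNOWFLAKE_PWD")); // must be set in the env
    props.put("warehouse", "ANALYTICS_WH");               // hypothetical warehouse
    props.put("db", "LEGACY_DW");
    props.put("schema", "PUBLIC");

    // Placeholder account identifier; Snowflake accounts live in AWS regions.
    String url = "jdbc:snowflake://myaccount.us-east-1.snowflakecomputing.com";
    try (Connection conn = DriverManager.getConnection(url, props);
         Statement stmt = conn.createStatement();
         ResultSet rs = stmt.executeQuery("SELECT CURRENT_VERSION()")) {
      while (rs.next()) {
        System.out.println(rs.getString(1)); // prints the Snowflake version
      }
    }
  }
}
```

In the post itself, Talend's Snowflake connectors hide this plumbing behind visual components; the sketch just shows what the underlying connection amounts to.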

How to containerize your integration jobs with one click with Talend and Docker

Talend Data Integration is an enterprise data integration platform that provides visual design while generating simple Java code. This lightweight, modular design approach is a great fit for containers. In this blog post, we'll walk you through how to containerize your Talend job with a single click. All of the code examples in this post can be found in our Talend Job2Docker Git repository, and its README includes step-by-step instructions.
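
To show the general shape of the result, here is a hypothetical Dockerfile (not the Job2Docker script itself) that wraps a Talend standalone job build in an image, assuming the export zip contains the usual generated launcher script; all names are illustrative:

```dockerfile
# Hypothetical Dockerfile: package a Talend standalone job export in an image.
# Assumes my_job_0.1.zip contains a generated my_job/my_job_run.sh launcher.
FROM openjdk:8-jre-alpine

# The generated launcher is a shell script; busybox provides unzip,
# and bash is added in case the script expects it.
RUN apk add --no-cache bash

WORKDIR /opt/talend
COPY my_job_0.1.zip .
RUN unzip my_job_0.1.zip && rm my_job_0.1.zip \
    && chmod +x my_job/my_job_run.sh

# Each container run executes one instance of the job.
ENTRYPOINT ["/opt/talend/my_job/my_job_run.sh"]
```

Building with `docker build -t my_job .` and starting it with `docker run my_job` would then execute the job in an isolated container, which is essentially what the one-click Job2Docker flow automates.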