
A single-click Kafka topic backup experience

We aim to take the most mundane, complex, and time-consuming work out of managing a Kafka platform. One such task is backing up topic data. With a growing reliance on Kafka for a wide range of workloads, a solid backup strategy is no longer a nice-to-have but a necessity. If you haven't been backing up your Kafka topics and you live in fear of disaster striking, worry no more.

Lenses 5.3: Robust Kafka with single click topic backup/restore

Navigating the intricacies of Apache Kafka just got a lot more intuitive. With Lenses 5.3 we bring you peace of mind, wherever you are in your Kafka journey. Our newest release is all about smoothing out the bumps and making sure you're equipped to handle Kafka's challenges with confidence. Here's a sprinkle of what's in store ahead of our big 6.0 release later this year.

How to Use SFTP to Securely Transfer Files

Transferring files securely between machines is a fundamental part of the ETL (Extract, Transform, Load) process, which involves extracting data from one source, transforming it for analysis, and loading it into a data warehouse. The challenge? Ensuring these files are sent and received without interception by malicious entities. For years, FTP (File Transfer Protocol) served as the go-to method for moving files between a client and a server on a network, but FTP transmits both credentials and data in plaintext.
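In practice, a secure transfer is often just a short scripted session. Below is a minimal sketch using the OpenSSH `sftp` client in batch mode; the host, user, and file paths are hypothetical placeholders, and the actual connection line is left commented out since it requires a live server:

```shell
# Write the sftp commands for a non-interactive session to a batch file.
cat > transfer.batch <<'EOF'
put extract.csv /staging/extract.csv
get /staging/results.csv .
bye
EOF

# Then run the batch against a (hypothetical) remote host over SSH:
# sftp -b transfer.batch etl@warehouse.example.com
```

Batch mode (`-b`) aborts on the first failed command, which is usually what an ETL job wants: a partial transfer should fail loudly rather than continue silently.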

From Analytics to Outreach

Not all heroes in the tech world write code. Some wield the power of data analytics and SEO to create compelling stories and foster brand growth. This week, our Monday Member Spotlight features Jose, TestQuality’s Marketing Assistant with years of specialized experience in Google Analytics and SEO. Let's explore how he takes a data-driven approach to spread the word about TestQuality.

Driving Data Discovery and Reliability for Better Business Decision Making

Enterprises are drowning in data. For the modern, data-driven enterprise, data, whether structured, semi-structured, or unstructured, is everything, everywhere, all at once. But that's also a challenge for enterprises looking to transform their data into usable information for business success. The sheer volume makes it hard for enterprises to find the trustworthy, reliable data they need to drive business decisions. Traditional data catalogs offer only structured data discovery.

How to Increase Data Processing: Combining SFTP and Heroku

Secure File Transfer Protocol (SFTP), at its core, is a protocol designed to provide secure file transfer capabilities. Widely used in web development and IT infrastructure, its primary use case is the encrypted transfer of files between remote servers and local machines.
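SFTP encrypts the channel, but data pipelines typically also verify that a file arrived intact end to end. A minimal sketch of checksum verification using only Python's standard library; the file contents are a stand-in and the SFTP round trip itself is omitted:

```python
import hashlib
import os
import tempfile

def sha256_of(path: str) -> str:
    """Stream the file through SHA-256 so large extracts need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical round trip: hash before upload, hash the fetched copy, compare.
with tempfile.TemporaryDirectory() as tmp:
    extract = os.path.join(tmp, "extract.csv")   # stand-in for a real extract
    with open(extract, "w") as fh:
        fh.write("id,value\n1,42\n")
    sent_digest = sha256_of(extract)
    # ... upload via SFTP, then download the remote copy (omitted) ...
    received_digest = sha256_of(extract)         # would be the retrieved file
    assert sent_digest == received_digest, "file corrupted in transit"
```

Comparing digests on both sides catches truncated or corrupted transfers that an encrypted channel alone cannot detect.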

Use GitOps as an efficient CI/CD pipeline for Data Streaming | Data Streaming Systems

Early automation saves time and money. GitOps improves the CI/CD pipeline, enhancing operations and traceability. Learn how to use GitOps for data streaming platforms and streaming applications with Apache Kafka and Confluent Cloud.
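As a concrete illustration, a GitOps flow for topic configuration can be as small as a pipeline that re-applies declarative definitions whenever they change in Git. A minimal sketch in GitHub Actions workflow syntax; the `topics/` directory and the apply script are hypothetical stand-ins for whatever declarative tooling your platform uses:

```yaml
# Hypothetical workflow: re-apply declarative topic definitions on merge to main.
name: deploy-topics
on:
  push:
    branches: [main]
    paths:
      - "topics/**"
jobs:
  apply:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Placeholder apply step: swap in your own tooling, e.g. a Terraform
      # plan/apply or a custom script calling the Kafka Admin API.
      - run: ./scripts/apply-topics.sh topics/
```

Because every change lands through a reviewed commit, the Git history doubles as an audit log of exactly who changed which topic configuration and when.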

Robust Disaster Recovery with Kafka and Confluent Cloud | Data Streaming Systems

Explore the resilience of Kafka, understand the implications of datacenter disruptions, and learn how to mitigate the impact of data loss. Learn to scale with Confluent Cloud, use cluster and schema linking, and apply an active/passive disaster recovery pattern for business continuity.