
Running Kafka in Kubernetes: What We Learned

Apache Kafka is mission-critical for many organizations—but where you deploy it matters just as much as how you use it. In this video, two OpenLogic experts discuss why they increasingly encourage customers to move their Kafka clusters to Kubernetes and utilize the Strimzi operator, and what that shift unlocks from an operational, scalability, and resilience standpoint.
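To give a sense of what the Strimzi operator manages, here is a minimal sketch of a Strimzi `Kafka` custom resource. The cluster name, Kafka version, replica counts, and storage type are illustrative placeholders, not values taken from the video:

```yaml
# Minimal Strimzi Kafka cluster definition (illustrative placeholders throughout)
apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
metadata:
  name: my-cluster          # hypothetical cluster name
spec:
  kafka:
    replicas: 3             # broker count; sized per workload
    listeners:
      - name: plain
        port: 9092
        type: internal
        tls: false
    storage:
      type: ephemeral       # use persistent-claim storage in production
  zookeeper:
    replicas: 3
    storage:
      type: ephemeral
  entityOperator:
    topicOperator: {}       # lets Strimzi manage topics via KafkaTopic resources
```

Applying a resource like this with `kubectl apply` has the operator create and reconcile the brokers, ZooKeeper ensemble, and supporting Kubernetes objects, which is the operational shift the video discusses.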

Common Kafka Anti-Patterns and How to Avoid Them

Kafka is powerful—but common Kafka mistakes can quietly undermine performance, reliability, and scalability. In this video, two OpenLogic experts break down the most frequent Kafka anti-patterns they see in real customer environments—and how to avoid them. The discussion is based on hands-on experience fixing production Kafka clusters. If you’re running Apache Kafka in production—or planning to—this video will help you spot Kafka mistakes early and apply proven best practices to build a more stable, scalable event streaming platform.

What is an MCP for Kafka with Tun Shwe

AI agents are only as good as the data they can access. In this video, we explore the Model Context Protocol (MCP) and how it creates a bridge between AI models and Apache Kafka. Learn how MCP allows AI agents to securely produce, consume, and manage Kafka topics in real-time—transforming your event streams into actionable context for LLMs.

How to Build a Custom Kafka Connector - A Comprehensive Guide

In today’s data-driven world, seamless data integration is crucial to ensuring the smooth operation of modern systems. With the growing complexity of distributed data platforms, businesses and developers are seeking efficient ways to move, process, and transform data. Apache Kafka has become the de facto standard for real-time data streaming, and Kafka Connect plays a key role in facilitating the integration of Kafka with various data sources and sinks.
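As a concrete reference point for how Kafka Connect wires a source into Kafka, here is a sketch of the JSON configuration for the `FileStreamSourceConnector` that ships with Kafka as an example; the connector name, file path, and topic are hypothetical:

```json
{
  "name": "file-source-demo",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "tasks.max": "1",
    "file": "/tmp/demo.txt",
    "topic": "demo-topic"
  }
}
```

Posting a payload like this to the Connect REST API (`POST /connectors`) starts a task that tails the file and streams each line into the topic; a custom connector plugs into the same lifecycle with its own `connector.class` and configuration keys.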

Confluent Connect: FY'25 Launch Highlights - Unlocking Data & Powering AI Pipelines

Dive into the biggest breakthroughs for the Confluent Connect ecosystem in 2025! This year, we made moving data easier than ever, from modernizing legacy systems with the Oracle XStream CDC Premium Connector to empowering developers with Custom SMTs and Custom Connectors on Google Cloud. Discover the over 10 new connectors we launched, including Snowflake Source, Azure Cosmos DB v2, and Neo4j Sink, plus the release of Confluent Hub 2.0. Learn how Confluent Cloud connectors are breaking down silos and building bridges for your next-gen AI and data modernization projects.

Why Managing Your Apache Kafka Schemas Is Costing You More Than You Think

For developers building event-driven systems, schemas define the data contracts between producers and consumers in Apache Kafka, ensuring every message can be correctly interpreted. But when schema management is handled manually or through do-it-yourself (DIY) solutions, organizations face escalating expenses that compound as their deployments scale.
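To make the idea of a data contract concrete, here is a small example of an Avro schema of the kind producers and consumers would agree on; the record name, namespace, and fields are hypothetical:

```json
{
  "type": "record",
  "name": "OrderCreated",
  "namespace": "com.example.events",
  "fields": [
    {"name": "orderId", "type": "string"},
    {"name": "amount", "type": "double"},
    {"name": "createdAt",
     "type": {"type": "long", "logicalType": "timestamp-millis"}}
  ]
}
```

Keeping schemas like this in a registry, rather than copied into each service, is what lets compatibility rules be enforced automatically as the schema evolves—the cost of doing that by hand is the subject of the article.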

Load Testing Kafka #speedscale #kafka #loadtesting

Message brokers are a critical component of modern distributed systems, facilitating asynchronous communication between services. Load testing message broker integrations requires special considerations, since the interaction patterns differ from those of traditional HTTP-based APIs. Speedscale provides specialized tooling to help you load test applications that integrate with message brokers.