Beyond Zero-Ops: Architectural Precision for MongoDB Atlas Connectors

Whether you’re streaming change data capture (CDC) events from MongoDB to Apache Kafka or sinking high-velocity data from Kafka into MongoDB for analytics, the following best practices ensure a secure, performant, and resilient architecture. This technical deep dive covers implementing the MongoDB Atlas Source and Sink Connectors on Confluent Cloud.

How to Future-Proof Architectures With Continuous Availability Via Hybrid & Multicloud

When designing on-premises and cloud systems, you have to balance resilience, security, and scalability. But ultimately, what your organization and business leaders care about is the bottom line: today’s costs and tomorrow’s risk. As a result, hybrid and multicloud strategies are often viewed as simply a backup or disaster recovery measure, rather than a path to availability that your applications and business operations can truly count on.

Do Customers Really Care If You Love Them?

Customers don’t buy software because they feel loved. They buy it because the product works, solves a real problem, meets security, scalability, and reliability requirements, and fits their budget. No amount of empathy or friendliness can compensate for missing features or poor performance. So at first glance, it’s easy to assume that great products alone win customer loyalty. But once the contract is signed and the product is in use, the rules change.

Disaster Recovery in 60 Seconds: A POC for Seamless Client Failover on Confluent Cloud

I’ve worked with Apache Kafka since 2019, and deciding how to design and implement client failover was a sticking point in almost every use case I dealt with. Even for Confluent customers—who have the benefit of features such as Confluent Replicator, Multi-Region Clusters, and Cluster Linking—ensuring seamless failover between Kafka environments is a challenging problem.

Focal Systems: Boosting Store Performance with an AI Retail Operating System and Real-Time Data

Every second a product sits out of stock on a shelf, revenue quietly drains away. Customers walk out empty-handed, and businesses lose customers as well as valuable insights into what’s actually happening on the floor. At Focal Systems, we have built Shelf AI, a system that continuously “sees” store shelves, detects stockouts, and guides teams to replenish in time—so products are on the shelf when customers need them.

Streaming Data Integration with Apache Kafka

Data streaming with events supports a wide range of applications and use cases. Event-driven microservices built on data streaming allow companies to structure applications around domain-driven designs, breaking them into composable microservices that teams can work on independently, which speeds development. These designs scale well and can process huge amounts of data efficiently.

Cloud API Keys vs Resource-Specific API Keys in Confluent Cloud

As you build and manage data streams in Confluent Cloud, securing your interactions with its APIs is paramount. Confluent Cloud offers two types of API keys that manage authentication to the different APIs in Confluent Cloud: cloud API keys and resource-specific API keys. Each has its own distinct characteristics and use cases.
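As a rough illustration of the distinction, both key types can be created with the Confluent CLI’s `confluent api-key create` command, where the `--resource` flag determines the key’s scope. This is a sketch, not the full workflow; the cluster ID `lkc-123abc` below is a placeholder, not a real resource:

```shell
# Cloud API key: scoped to the Confluent Cloud control plane
# (organization, environment, and account management APIs).
confluent api-key create --resource cloud

# Resource-specific API key: scoped to a single resource —
# here a Kafka cluster (lkc-123abc is a placeholder ID).
confluent api-key create --resource lkc-123abc
```

In practice, cloud API keys suit automation that manages your organization, while resource-specific keys are what clients use to produce to and consume from a given cluster.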

Empowering Customers: The Role of Confluent's Trust Center

The foundation of every successful customer relationship is trust. At Confluent, we understand that for our customers and prospects to innovate with confidence, they must have complete trust in the security and integrity of our platform. Our commitment goes beyond simply providing a secure product. It’s about empowering our customers with the tools and transparency they need to feel confident in their data streaming architectures.

Thunai Automates Customer Support with AI Agents and Data Streaming

Support teams live in a world of repetitive questions, fragmented tools, and growing customer expectations. Customer service agents bounce between customer relationship management (CRM) systems, ticketing, email, and chat while customers wait, often repeating the same information across channels. Batch-based systems don’t scale for AI: context is always a step behind, escalations pile up, and it’s difficult to intervene in time.

Starting With Purpose: In-Person Onboarding in a Remote-First World

The hardest part of remote work is building real connection and purpose when people aren’t all in the same room. At Confluent, we know flexibility is essential, but we also know that great work and a sense of belonging don’t just happen; they take effort. That’s why we’re intentional about how we bring people together, starting from day one.