
Dodge the thundering herd with file-based OPcache

In the blog post about Fine-Tuning OPcache Configuration I mentioned the thundering herd problem that affects OPcache during cache restarts. When OPcache is restarted, whether automatically or manually, every in-flight request attempts to regenerate the cache entries at once. Under load, this can cause a spike in CPU usage and significantly slower requests.
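The fix named in the title is OPcache's file cache, which persists compiled scripts to disk so they survive a restart instead of being recompiled by every request at once. A minimal php.ini sketch (the cache directory path is an example, not a required value):

```ini
; Persist compiled scripts to disk so they survive OPcache restarts.
; The directory must exist and be writable by the PHP process.
opcache.file_cache=/var/cache/php-opcache

; 0 = use shared memory as the primary cache, with the file cache
; as a warm fallback after a restart; 1 = use the file cache only.
opcache.file_cache_only=0

; Verify checksums when loading entries from the file cache
; (slightly slower, but guards against corrupted cache files).
opcache.file_cache_consistency_checks=1
```

After a restart, OPcache can load the on-disk entries instead of recompiling every script under concurrent load, which is what blunts the thundering herd.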

Data and AI Trends 2026: Predictions for Agentic AI Production

Agentic AI is moving quickly from experiments to real work. In 2026, it shows up inside the workflows that drive outcomes: decisions, operations, and accountability. In the season 7 premiere of the Data Chief podcast, host Cindi Howson sat down with three leaders who work at the intersection of AI ambition and enterprise execution: Paul Baier (GAI Insights), Jennifer Belissent (Snowflake), and Rory Blundell (Gravitee).

Build vs. Buy: Why Embedded Analytics is the Strategic Choice for Modern Data Leaders

For today’s CTOs and CIOs, the pressure to deliver actionable data insights within their products has never been higher. However, a critical dilemma often stalls progress toward the business intelligence tools needed for the task: should your engineering team build a bespoke analytics engine from scratch, or should you integrate a professional embedded solution?

IDP vs. OCR: Evolving Approaches to Document Processing

Reading emails, scanning contracts, manually processing invoices—the tedious tasks related to document processing can jam up your business operations. Document processing is a prime candidate for automation, but the technology is advancing so fast that it can be hard to know where to start or when it is time to modernize. OCR (Optical Character Recognition) and IDP (Intelligent Document Processing) are two approaches to tackling business documents.

How to Connect LLM Chat and AI Agents to Enterprise Data Using Built-In MCP in DreamFactory

TL;DR: DreamFactory 7.4+ includes a built-in MCP (Model Context Protocol) server that lets you connect any LLM—ChatGPT, Claude, Perplexity, or custom AI agents—to your enterprise databases through governed, role-based APIs. Setup takes minutes: create an MCP service in the admin console, copy the OAuth credentials, and point your AI application to the generated endpoint.

The Future of AI in the Enterprise

As AI continues to rise in importance across all industries, falling implementation costs, ready access to cloud computing, and practical business use cases are making AI-powered offerings a competitive advantage for product, engineering, and data leaders. However, AI isn’t without its fair share of risks and challenges.

A Memory-centric Approach to System Strategy: 6 Takeaways from Supercomputing 2025

Artificial intelligence workloads are reshaping how memory is produced, priced, and prioritized. Not because the supply chain has fundamentally broken, but because manufacturers are making deliberate decisions about where to place capacity and capital. Wafer lines are being steered toward high-margin, long-term AI demand, not toward broad, undifferentiated expansion. HBM, advanced DRAM, and other AI-optimized memory now command the majority of investment and forward planning.

Supermetrics MCP vs. Databox MCP: Choosing Between Data Pipeline and Analytics Platform

If you’re evaluating MCP servers for your analytics stack, you’ve probably noticed that “MCP support” can mean very different things depending on the vendor. I’ve been working with both platforms, and the distinction matters more than most comparison articles let on. Supermetrics and Databox both offer MCP implementations, but they’re built for different jobs.