In our recent webinar, we tackled a familiar frustration for many engineering teams: long build times. Our expert panel broke down the problem and explored how remote build caching can deliver faster builds, improved workflows, and real business value. Here are the five biggest takeaways.
BigQuery Data Engineering Agents are here to help data analysts and engineers build faster and focus more on creative problem-solving. Lucia Subatin shows how these AI-powered agents can save you time on tedious coding, schema mapping, and manual metadata creation. Speakers: Lucia Subatin. Products Mentioned: AI Infrastructure, BigQuery.
Welcome back to our ongoing series on strategies for gaining an unfair advantage. So far, we’ve explored how leveraging the tech ecosystem, overcoming adversity, unlocking hidden value in data, fostering great partnerships, and balancing risk with reward can set your organization apart. Each topic has offered actionable insights and real-world examples to help you stay ahead.
Sometimes the biggest opportunities come disguised as unproven protocols released on a random Monday. Here’s why we bet on MCP before anyone asked us to. Two months before anyone knew what MCP was, we made a bet that it would fundamentally transform how people interact with their data infrastructure.
Have you checked out SmartBear's Server? It opens the door to capabilities from across our API Hub, Test Hub, and Insight Hub, right within your AI development environment. In this video, Yousif Ahmed fixes a real software issue using a combination of our Insight Hub MCP tools together with GitHub Copilot.
When we first started using Playwright for automated testing, the built-in test runner and its reports seemed fine. They showed passes and failures, which worked for small projects. But as our test suites and CI/CD pipelines grew, the default test runner reports became limiting, especially when analyzing detailed results from large-scale test runs.
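One common way to get past the default output (a general workaround, not something the post prescribes) is Playwright's multi-reporter configuration, which can emit a machine-readable JSON report alongside the console and HTML output for downstream analysis. A minimal sketch, assuming the standard `@playwright/test` config API; the file paths are illustrative:

```typescript
// playwright.config.ts — illustrative multi-reporter setup
import { defineConfig } from '@playwright/test';

export default defineConfig({
  reporter: [
    ['list'],                                        // concise console output for local runs
    ['html', { open: 'never' }],                     // browsable report artifact for CI
    ['json', { outputFile: 'results/report.json' }], // machine-readable results for custom analysis
  ],
});
```

The JSON output can then be parsed in a CI step to surface slow tests or flaky failures that the default report buries.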
AI Needs All Your Data. Your ETL Vendor Is Charging You to Keep It Locked Away. In this post, we explore why legacy ETL pricing models are fundamentally misaligned with the demands of modern AI workloads.
In the last decade, AWS has redefined how businesses build data pipelines. Its ETL toolset isn't just about moving datasets; it's about orchestrating security, compliance, scale, and efficiency. Whether you're migrating legacy data systems or building modern ELT workflows, AWS offers a robust, versatile stack of services to meet virtually any requirement.
This tutorial demonstrates how to build a Fivetran Connector SDK custom connector using VS Code and GitHub Copilot. The demo showcases the end-to-end process of creating, testing, and deploying a connector that ingests tobacco problem reports from the openFDA API.