
Performance Testing, Artificial Intelligence and Machine Learning

We are going to look at how performance testing can work hand in hand with Artificial Intelligence and Machine Learning. Because there are many Artificial Intelligence solutions to choose from, this post discusses the principles of Artificial Intelligence working alongside performance testing rather than any particular framework.
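One principle that applies regardless of framework is using a learned baseline to judge performance results automatically. The sketch below is a deliberately minimal illustration (all names are ours, not from any product): it flags response times that deviate sharply from a statistical baseline built from earlier runs.

```python
import statistics

def flag_anomalies(baseline_ms, new_samples_ms, threshold=3.0):
    """Flag response times deviating from the baseline mean by more
    than `threshold` standard deviations (a simple z-score check)."""
    mean = statistics.mean(baseline_ms)
    stdev = statistics.stdev(baseline_ms)
    return [s for s in new_samples_ms if abs(s - mean) > threshold * stdev]

# Baseline response times (ms) collected from earlier test runs.
baseline = [102, 98, 105, 99, 101, 103, 97, 100]
# A new run: one sample is a clear regression.
anomalies = flag_anomalies(baseline, [104, 99, 250, 101])
print(anomalies)  # -> [250]
```

A real Machine Learning approach would replace the z-score with a trained model, but the workflow (learn from past runs, score new runs) is the same.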

Zero Trust and the Appian Platform's Adoption Strategy

The term “Zero Trust” has become one of the most important concepts in the information security industry. An all-encompassing phrase for many modern security best practices, Zero Trust is a conceptual design philosophy focused on continuous authentication and authorization for each action a user takes within a session rather than verification that only occurs at the start of a session.
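The per-action model can be sketched in a few lines. This is a minimal illustration of the concept only (the role names and functions are ours, not Appian's API): authorization runs on every request, so a permission change takes effect mid-session instead of waiting for the next login.

```python
# Role-to-permission mapping; in practice this would come from a
# policy service consulted on each request.
ROLE_PERMISSIONS = {
    "analyst": {"read_report"},
    "admin": {"read_report", "delete_report"},
}

def authorize(user, action):
    """Return True only if the user's current role grants the action."""
    return action in ROLE_PERMISSIONS.get(user["role"], set())

def handle_request(user, action):
    # Zero Trust: re-check authorization per action, never trust a
    # decision made once at session start.
    if not authorize(user, action):
        return "403 Forbidden"
    return f"200 OK: {action}"

alice = {"name": "alice", "role": "analyst"}
print(handle_request(alice, "read_report"))    # -> 200 OK: read_report
print(handle_request(alice, "delete_report"))  # -> 403 Forbidden
```

If `alice` were demoted between two requests, the second request would be denied immediately, which is the behavior session-start-only verification cannot provide.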

An Overview of Streaming Analytics in AWS for Logging Applications

Streaming analytics in AWS gives enterprises the ability to process and analyze log data in real time, enabling use cases that range from delivering personalized customer experiences to anomaly and fraud detection, application troubleshooting, and user behavior analysis. In the past, log analytics solutions could process just a few thousand records per second, and it still took minutes or hours to process the data and get answers.
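The key difference from batch processing is that records are analyzed as they arrive. The sketch below illustrates that idea in plain Python (it is not an AWS SDK example; on AWS this role is typically played by services such as Kinesis Data Streams): a fraud-style alert fires the moment a user crosses a threshold, while the stream is still flowing.

```python
def detect_fraud(stream, max_per_user=3):
    """Yield an alert as soon as any user exceeds max_per_user events."""
    counts = {}
    for record in stream:  # records are consumed one at a time
        user = record["user"]
        counts[user] = counts.get(user, 0) + 1
        if counts[user] == max_per_user + 1:
            yield f"alert: {user} exceeded {max_per_user} events"

log_stream = iter(
    [{"user": "u1"}, {"user": "u2"}, {"user": "u1"},
     {"user": "u1"}, {"user": "u1"}]
)
for alert in detect_fraud(log_stream):
    print(alert)  # fires on u1's 4th event, mid-stream
```

A batch system would only raise this alert after the whole log file was collected and processed; the streaming version reacts within one record of the threshold being crossed.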

Streamline Mobile App Management with Sauce Labs

In this blog, learn how to manage your mobile apps all in one place for a more connected and efficient testing experience with Sauce Labs' new App Management dashboard. Ensuring a seamless user experience for your mobile app requires an increasing amount of testing. With customer expectations also rising, it's never been more important for mobile development teams to have a highly scalable and reliable testing solution in their tech stack.

Maximize Business Results with FinOps

As organizations run more data applications and pipelines in the cloud, they look for ways to avoid the hidden costs of cloud adoption and migration. Teams seek to maximize business results through cost visibility, forecast accuracy, and financial predictability. Watch the breakout session video from Data Teams Summit and see how organizations apply agile and lean principles using the FinOps framework to boost efficiency, productivity, and innovation. Transcript available below.

Enabling Strong Engineering Practices at Maersk

As DataOps moves along the maturity curve, many organizations are deciphering how to best balance the success of running critical jobs with optimized time and cost governance. Watch the fireside chat from Data Teams Summit where Mark Sear, Head of Data Platform Optimization for Maersk, shares how his team is driving towards enabling strong engineering practices, design tenets, and culture at one of the largest shipping and logistics companies in the world.

Spark Technical Debt Deep Dive

Once in a while I stumble upon Spark code that looks like it was written by a Java developer, and it never fails to make me wince, because it is a missed opportunity to write elegant and efficient code: it is verbose, difficult to read, and full of distributed-processing anti-patterns. One such occurrence happened a few weeks ago, when one of my colleagues was trying to make some churn analysis code downloaded from GitHub work.

Puppeteer in Node.js: Common Mistakes to Avoid

Puppeteer is a powerful Node.js browser automation library for integration testing and web scraping. However, like any complex software, it comes with plenty of potential pitfalls. In this article, I'll discuss a variety of common Puppeteer mistakes I've encountered in personal and consulting projects, as well as when monitoring the Puppeteer tag on Stack Overflow.

Revamping Data Management Strategies with Data Pipelines

1. Data pipelines can improve data management strategies by enabling quick and easy data flow, transformation, and analysis.
2. Considerations when building a data pipeline include real-time data ingestion, scalability, performance optimization, data security and governance, and support for multiple sources.
3. Data mesh is a decentralized data architecture that organizes data sources by their specific business domains, each of which must comply with the principles of the architecture.
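The flow-transform-analyze idea in point 1 can be sketched as a chain of stages. This is a minimal illustration with made-up stage names and data, not a reference to any particular pipeline tool; each stage is a generator, so records flow through one at a time, as they would in a streaming pipeline.

```python
def ingest(raw_lines):
    """Parse raw CSV-style lines into records."""
    for line in raw_lines:
        user, amount = line.split(",")
        yield {"user": user, "amount": float(amount)}

def transform(records):
    """Enrich each record, e.g. a currency conversion."""
    for rec in records:
        rec["amount_usd"] = round(rec["amount"] * 1.1, 2)
        yield rec

def analyze(records):
    """Aggregate transformed records per user."""
    totals = {}
    for rec in records:
        totals[rec["user"]] = round(
            totals.get(rec["user"], 0) + rec["amount_usd"], 2)
    return totals

raw = ["alice,10.0", "bob,5.0", "alice,2.0"]
print(analyze(transform(ingest(raw))))  # -> {'alice': 13.2, 'bob': 5.5}
```

The considerations in point 2 (scalability, governance, multiple sources) are about hardening exactly these three stages for production workloads.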