
Machine Learning

Explaining machine learning models to business users using BigQuery ML and Looker

Organizations increasingly turn to AI to transform work processes, but this rapid adoption of models has amplified the need for explainable AI. Explainability helps us understand how and why models make predictions. For example, a financial institution might wish to use an AI model to automatically flag fraudulent credit card transactions. While an accurate fraud model would be a first step, accuracy alone isn't sufficient.
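To make the idea concrete, here is a minimal, self-contained sketch of one common explainability technique, permutation importance: shuffle one feature at a time and see how much the model's accuracy drops. The transactions, feature names, and scoring rule below are invented for illustration; the article itself works with BigQuery ML and Looker rather than hand-rolled Python.

```python
# Illustrative only: a toy "fraud score" model and a permutation-importance
# check, showing the kind of per-feature explanation a business user might
# see surfaced in a dashboard.
import random

# Hypothetical transactions: (amount, hour_of_day, is_foreign)
data = [
    (1200.0, 3, 1), (15.0, 14, 0), (980.0, 2, 1),
    (40.0, 11, 0), (2500.0, 4, 1), (12.0, 16, 0),
]
labels = [1, 0, 1, 0, 1, 0]  # 1 = fraudulent

def model(amount, hour, foreign):
    """Toy scoring rule standing in for a trained fraud model."""
    return 1 if (amount > 500 and foreign) else 0

def accuracy(rows, ys):
    return sum(model(*r) == y for r, y in zip(rows, ys)) / len(ys)

baseline = accuracy(data, labels)

# Permutation importance: shuffle one feature column at a time and measure
# how much accuracy drops -- a bigger drop means a more important feature.
random.seed(0)
for i, name in enumerate(["amount", "hour_of_day", "is_foreign"]):
    col = [row[i] for row in data]
    random.shuffle(col)
    permuted = [row[:i] + (col[j],) + row[i + 1:] for j, row in enumerate(data)]
    drop = baseline - accuracy(permuted, labels)
    print(f"{name}: accuracy drop {drop:.2f}")
```

An explanation like this answers the business user's real question: not just "was this transaction flagged?" but "which features drove the flag?"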

Growing AI Fast with ML-Ops: Breaking the barrier between research and production

AI models get smarter, more accurate, and therefore more useful over the course of their training on large datasets that have been painstakingly curated, often over a period of years. But in real-world applications, datasets start small. To design a new drug, for instance, researchers start by testing a handful of compounds and rely on AI to predict the most promising permutations.

Interview with Machine Learning Engineer Semih Cantürk

In the latest instalment of our interview series with leaders from across the world of tech, we welcomed Semih Cantürk. Semih is a Machine Learning Engineer at Zetane Systems and an MSc & incoming PhD student at the University of Montréal and MILA Institute. At Zetane, he's responsible for developing and integrating explainable AI algorithms, in addition to leading various projects.

ODSC Webinar: Git Based CI/CD for ML

In this session, Yaron Haviv, Iguazio's Co-Founder and CTO, discussed how to enable continuous delivery of machine learning to production using Git-based ML pipelines (GitHub Actions) with hosted training and model serving environments. He touched upon how to leverage Git to solve rigorous MLOps needs: automating workflows, reviewing models, storing versioned models as artifacts, and running CI/CD for ML. He also covered how to enable controlled collaboration across ML teams using Git review processes, and how to implement an MLOps solution based on available open-source tools and hosted ML platforms. The session also included a live demo.
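As a rough illustration of the Git-based flow described above, a GitHub Actions workflow might retrain and validate a model on every pull request and attach the versioned model as a build artifact for review. This is a sketch, not the session's actual pipeline; the script names (`train.py`, `validate.py`) and file paths are hypothetical.

```yaml
# Hypothetical CI workflow: retrain and validate a model on each PR,
# and store the versioned model file as a build artifact for review.
name: ml-ci
on:
  pull_request:
    branches: [main]

jobs:
  train-and-validate:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install -r requirements.txt
      - run: python train.py --out model.pkl   # hypothetical training script
      - run: python validate.py model.pkl      # fail the job if metrics regress
      - uses: actions/upload-artifact@v4
        with:
          name: model-${{ github.sha }}
          path: model.pkl
```

Gating the merge on the validation step is what turns an ordinary code-review workflow into a model-review workflow: a model only reaches `main` after its metrics pass CI.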

The Complete Guide to Using the Iguazio Feature Store with Azure ML - Part 1

In this series of blog posts, we will showcase an end-to-end hybrid cloud ML workflow using the Iguazio MLOps Platform & Feature Store combined with Azure ML. This first post gives an overview of the solution and the types of problems it solves; subsequent parts will be a technical deep dive into each step of the process.

Building an MLOps infrastructure on OpenShift

Most data science projects never make it past the PoC phase and hence never generate any business value. In 2019, Gartner estimated that "through 2022, only 20% of analytic insights will deliver business outcomes". One of the main reasons for this is undoubtedly that data scientists often lack a clear vision of how to deploy their solutions to production, how to integrate them with existing systems and workflows, and how to operate and maintain them.

Looking into 2022: Predictions for a New Year in MLOps

In an era where the passage of time seems to have changed somehow, it definitely feels strange to already be reflecting on another year gone by. It's a cliché for a reason: the world definitely feels like it's moving faster than ever, and in some completely unexpected directions. Considering the pace of technological progress I've witnessed in just a year, it sometimes feels like we're living in a time lapse.