ChatGPT sparks the imagination and highlights the importance of generative AI and foundation models as the basis for modern AI applications. However, it also brings a new set of AI operationalization challenges: handling massive amounts of data, large-scale compute and memory requirements, complex pipelines, transfer learning, extensive testing, monitoring, and more. In this talk, we explore these new technologies and share MLOps orchestration best practices that will enable you to automate the continuous integration and deployment (CI/CD) of foundation models and transformers, along with the application logic, in production.