Accelerating industrialization of Machine Learning at BMW Group using the Machine Learning Operations (MLOps) solution – AWS Blog

The BMW Group and Amazon Web Services (AWS) announced a strategic collaboration in 2020. The goal of that collaboration is to help further accelerate the BMW Group's pace of innovation by placing data and analytics at the center of its decision-making.

The BMW Group's Cloud Data Hub (CDH) manages company-wide data and data solutions on AWS. The CDH provides BMW analysts and Data Scientists with access to data that helps drive business value through data analytics and Machine Learning (ML). As part of BMW's larger strategy to leverage the availability of data within the CDH and to help accelerate the industrialization of Machine Learning, the BMW Group worked closely with AWS Professional Services to develop its Machine Learning Operations (MLOps) solution.

The BMW Group's MLOps solution includes (1) a reference architecture, (2) reusable Infrastructure as Code (IaC) modules that use Amazon SageMaker and analytics services, (3) ML workflows using AWS Step Functions, and (4) a deployable MLOps template that covers the ML lifecycle from data ingestion to inference.

The MLOps solution supported the BMW Group in accelerating the industrialization of its AI/ML use cases, resulting in significant value generation within the first two years after the solution's release. The long-term goal of BMW's MLOps solution team is to help accelerate the industrialization of over 80% of the AI/ML use cases at the BMW Group, enabling continuous innovation and improvement in AI/ML across the company.

Since 2022, the MLOps solution has been rolled out to AI/ML use cases across the BMW Group. It has seen widespread adoption and is recognized as the BMW-internal master solution for MLOps.

In this blog, we talk about the BMW Group's MLOps solution: its reference architecture, high-level technical details, and the benefits it brings to the AI/ML use case teams that develop and productionize ML models with it.

The MLOps solution was developed to address the requirements of AI/ML use cases at the BMW Group. These include integration with BMW data lakes such as the CDH, as well as ML workflow orchestration, data and model lineage, and governance requirements such as compliance, networking, and data protection.

AWS Professional Services and the MLOps solution team from the BMW Group collaborated closely with various AI/ML use case teams to discover successful patterns and practices. This enabled AWS and the BMW Group's MLOps solution team to gain a comprehensive understanding of the technology stack, as well as the complexities involved in productionizing AI/ML use cases.

To meet the BMW Group's AI/ML use case requirements, the team worked backwards and developed the MLOps solution architecture shown in Figure 1 below.

Figure 1: MLOps Solution Architecture

In the sections below, we explain each component of the MLOps solution as represented in the architecture.

The MLOps template is a composition of IaC modules and ML workflows built on AWS managed services with a serverless-first strategy, designed to let the BMW Group benefit from the scalability, reduced maintenance costs, and agility of ML on AWS. The template is deployed into the AWS account of each AI/ML use case to create an end-to-end, deployable ML and infrastructure pipeline, and is designed to act as a starting point for building AI/ML use cases at the BMW Group.

The MLOps template offers functional capabilities for the BMW Group's Data Scientists and ML Engineers, ranging from data import and exploration through training to deployment of ML models for inference. It supports the operation of AI/ML use cases at the BMW Group by offering version control as well as infrastructure and ML monitoring capabilities.

The MLOps solution is designed to offer functional and infrastructure capabilities for use cases as independent building blocks. AI/ML use case teams can adopt these capabilities as a whole or choose selected blocks to help meet their business goals.

Below is an overview of the MLOps template building blocks offered by the BMW Group's MLOps solution:

Figure 2: MLOps Template building blocks

The MLOps solution provides Data Scientists and ML Engineers at the BMW Group with example notebooks that help flatten their learning curve with AWS services. These example notebooks include:

The MLOps solution's training pipeline, developed using the AWS Step Functions Data Science SDK for Python, consists of the steps required to train ML models: data loading, feature engineering, model training, evaluation, and model monitoring.

Use case teams at the BMW Group have the flexibility to modify or expand the MLOps solution's training pipeline as required for their specific projects. Common customizations thus far have included parallel model training, simultaneous experiments, pre-production approval workflows, and monitoring and alert notifications via Amazon SNS integration.

The MLOps solution's training pipeline steps are shown in Figure 3 below:

Figure 3: Training Pipeline
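To make the pipeline structure concrete, the following is a minimal, hypothetical sketch of the Amazon States Language (ASL) definition that a Step Functions training pipeline like the one described above might compile down to. The state names, job names, and service-integration ARNs are illustrative assumptions, not BMW's actual pipeline definition.

```python
import json

def build_training_pipeline_definition(job_name: str) -> dict:
    """Build an illustrative ASL state machine for an ML training pipeline:
    load data -> feature engineering -> train -> evaluate -> register model.
    All resource ARNs and job names are placeholders for the sketch."""
    return {
        "Comment": "Illustrative ML training pipeline",
        "StartAt": "LoadData",
        "States": {
            "LoadData": {
                "Type": "Task",
                # .sync integrations make Step Functions wait for job completion
                "Resource": "arn:aws:states:::glue:startJobRun.sync",
                "Parameters": {"JobName": "load-data-from-cdh"},
                "Next": "FeatureEngineering",
            },
            "FeatureEngineering": {
                "Type": "Task",
                "Resource": "arn:aws:states:::sagemaker:createProcessingJob.sync",
                "Parameters": {"ProcessingJobName": f"{job_name}-features"},
                "Next": "TrainModel",
            },
            "TrainModel": {
                "Type": "Task",
                "Resource": "arn:aws:states:::sagemaker:createTrainingJob.sync",
                "Parameters": {"TrainingJobName": job_name},
                "Next": "EvaluateModel",
            },
            "EvaluateModel": {
                "Type": "Task",
                "Resource": "arn:aws:states:::sagemaker:createProcessingJob.sync",
                "Parameters": {"ProcessingJobName": f"{job_name}-evaluation"},
                "Next": "RegisterModel",
            },
            "RegisterModel": {
                "Type": "Task",
                "Resource": "arn:aws:states:::aws-sdk:sagemaker:createModelPackage",
                "End": True,
            },
        },
    }

definition = build_training_pipeline_definition("example-model-training")
print(json.dumps(definition, indent=2)[:80])
```

In practice, the Step Functions Data Science SDK generates a definition of this shape from Python step objects; expressing it as plain ASL makes it easy to see where use case teams could splice in parallel training branches or approval steps.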

The MLOps solution employs AWS CodePipeline to facilitate continuous integration and deployment workflows. The AWS CodePipeline source stage allows users at the BMW Group to select their preferred source control provider, such as AWS CodeCommit or GitHub Enterprise.

AI/ML use case teams at the BMW Group can use AWS CodePipeline to deploy the ML training pipeline, bootstrapping the AWS infrastructure required to orchestrate it, from reading data from the BMW Group data lake (e.g., the CDH) through model training, evaluation, and ML model registration.

When the training pipeline completes and registers the ML model in the Amazon SageMaker Model Registry, the MLOps solution uses Amazon EventBridge notifications to trigger AWS CodePipeline to deploy the inference module.
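An EventBridge rule of this kind reacts to SageMaker Model Package state-change events. Below is a small, hypothetical sketch of what such an event pattern could look like, together with a toy matcher that imitates a small subset of EventBridge's matching semantics for illustration; the model package group name is a placeholder, and the matcher is not the actual EventBridge implementation.

```python
# Illustrative event pattern for a rule that fires when a model package in an
# (assumed) model package group is approved in the SageMaker Model Registry.
event_pattern = {
    "source": ["aws.sagemaker"],
    "detail-type": ["SageMaker Model Package State Change"],
    "detail": {
        "ModelPackageGroupName": ["example-model-group"],  # placeholder name
        "ModelApprovalStatus": ["Approved"],
    },
}

def matches(pattern: dict, event: dict) -> bool:
    """Toy subset of EventBridge matching: every pattern key must exist in the
    event, and the event value must be one of the listed values (nested dicts
    are matched recursively)."""
    for key, allowed in pattern.items():
        if key not in event:
            return False
        if isinstance(allowed, dict):
            if not matches(allowed, event[key]):
                return False
        elif event[key] not in allowed:
            return False
    return True

# A simplified event, shaped like what SageMaker emits on approval.
sample_event = {
    "source": "aws.sagemaker",
    "detail-type": "SageMaker Model Package State Change",
    "detail": {
        "ModelPackageGroupName": "example-model-group",
        "ModelApprovalStatus": "Approved",
    },
}
print(matches(event_pattern, sample_event))  # → True
```

The rule's target would then be the CodePipeline pipeline that deploys the inference module, so that an approval in the registry is the only manual gate between training and inference deployment.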

Around 80% of the AI/ML use cases at the BMW Group served by the MLOps solution require high-performance, high-throughput methods for transforming raw data and generating inferences from it. To meet these needs, the MLOps solution offers a batch inference pipeline with the steps required for users at the BMW Group to load and pre-process the raw data, generate predictions, and monitor the predicted results for quality and explainability.

Along with the batch inference pipeline, AI/ML use case teams at the BMW Group are provided with the modules required to set up real-time inference in case they need low-latency predictions and API integration with external use case applications.

The MLOps solution's batch inference pipeline steps are shown in Figure 4 below:

Figure 4: Inference Pipeline
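The prediction step of a batch inference pipeline like this one typically runs as a SageMaker batch transform job. The sketch below builds an illustrative request of the shape SageMaker's CreateTransformJob API expects (it could be passed to `boto3.client("sagemaker").create_transform_job(**request)`); the bucket, paths, model name, and instance type are assumptions for the example, not values from BMW's pipeline.

```python
def build_transform_request(model_name: str, run_id: str) -> dict:
    """Build an illustrative CreateTransformJob request for batch inference.
    S3 locations and instance settings are placeholders for the sketch."""
    return {
        "TransformJobName": f"{model_name}-batch-{run_id}",
        "ModelName": model_name,  # a model already registered in SageMaker
        "TransformInput": {
            "DataSource": {
                "S3DataSource": {
                    "S3DataType": "S3Prefix",
                    # pre-processed data written by the earlier pipeline step
                    "S3Uri": f"s3://example-bucket/preprocessed/{run_id}/",
                }
            },
            "ContentType": "text/csv",
            "SplitType": "Line",  # one prediction per input line
        },
        "TransformOutput": {
            "S3OutputPath": f"s3://example-bucket/predictions/{run_id}/"
        },
        "TransformResources": {
            "InstanceType": "ml.m5.xlarge",
            "InstanceCount": 1,
        },
    }

request = build_transform_request("example-model", "2024-06-01")
print(request["TransformJobName"])  # → example-model-batch-2024-06-01
```

Because batch transform jobs scale out by instance count and tear down when finished, this style of step fits the high-throughput, pay-per-run profile the blog describes for the majority of use cases.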

The MLOps solution also allows AI/ML use case teams at the BMW Group to bring their own application stack in addition to the set of modules offered as part of the solution. This helps teams make the customizations their business and technical needs require.

The MLOps solution helped the AI/ML use cases of the BMW Group build and deploy production-grade models, reducing overall time to market by approximately 75%. The MLOps solution also offers a broad range of benefits to the BMW Group, including:

Learn more about BMW's Cloud Data Hub (CDH) in this blog post, explore AWS offerings on the AWS for Automotive page, or contact your AWS team today.

