Course content
In this training, you will learn to:
- Explain the benefits of MLOps
- Compare and contrast DevOps and MLOps
- Evaluate the security and governance requirements for an ML use case and describe possible solutions and mitigation strategies
- Set up experimentation environments for MLOps with Amazon SageMaker
- Explain best practices for versioning and maintaining the integrity of ML model assets (data, model, and code)
- Describe three options for creating a full CI/CD pipeline in an ML context
- Recall best practices for implementing automated packaging, testing, and deployment of data, models, and code
- Demonstrate how to monitor ML-based solutions
- Demonstrate how to build an automated ML solution that tests, packages, and deploys a model; detects performance degradation; and re-trains the model on newly acquired data
Program
Module 1: Introduction to MLOps
- Processes
- People
- Technology
- Security and governance
- MLOps maturity model
Module 2: Initial MLOps
- Bringing MLOps to experimentation
- Setting up the ML experimentation environment
- Demonstration: Creating and Updating a Lifecycle Configuration for SageMaker Studio
- Hands-On Lab: Provisioning a SageMaker Studio Environment with the AWS Service Catalog
- Workbook: Initial MLOps
Module 3: Repeatable MLOps: Repositories
- Managing data for MLOps
- Version control of ML models
- Code repositories in ML
- ML pipelines
- Demonstration: Using SageMaker Pipelines to Orchestrate Model Building Pipelines
- End-to-end orchestration with AWS Step Functions
- Hands-On Lab: Automating a Workflow with Step Functions
- End-to-end orchestration with SageMaker Projects
- Demonstration: Standardizing an End-to-End ML Pipeline with SageMaker Projects
- Using third-party tools for repeatability
- Demonstration: Exploring Human-in-the-Loop During Inference
- Governance and security
- Demonstration: Exploring Security Best Practices for SageMaker
- Workbook: Repeatable MLOps
Module 5: Reliable MLOps: Scaling and Testing
- Scaling and multi-account strategies
- Testing and traffic-shifting
- Demonstration: Using SageMaker Inference Recommender
- Hands-On Lab: Testing Model Variants
Module 5: Reliable MLOps: Scaling and Testing (continued)
- Hands-On Lab: Shifting Traffic
- Workbook: Multi-account strategies
- The importance of monitoring in ML
- Hands-On Lab: Monitoring a Model for Data Drift
- Operations considerations for model monitoring
- Remediating problems identified by monitoring ML solutions
- Workbook: Reliable MLOps
- Hands-On Lab: Building and Troubleshooting an ML Pipeline
Target audience
- MLOps engineers who want to put ML models into production on the AWS Cloud and monitor them
- DevOps engineers responsible for the successful deployment and maintenance of ML models in production
Prerequisites
- AWS Technical Essentials (classroom or digital)
- DevOps Engineering on AWS, or equivalent experience
- Practical Data Science with Amazon SageMaker, or equivalent experience