How to Deploy ML model to AWS Sagemaker with mlflow and Docker - Step by step

  • Data Science Garage
  • 2021-09-14
  • 15252
Tags: deploy ml model, aws sagemaker mlflow, mlflow sagemaker, build-and-push-container, deploy model sagemaker, aws sagemaker python, deploy ml model on aws, mlflow docker container, aws ecr, elastic container registry, mlflow sagemaker deploy, aws iam roles, mlflow tutorial python, aws docker tutorial

Video description: How to Deploy ML model to AWS Sagemaker with mlflow and Docker - Step by step

This video tutorial demonstrates how to deploy your Machine Learning (ML) model to AWS SageMaker with mlflow and the Docker Desktop application. In this course I tried to explain everything in detail, without any cuts or interruptions, including even the smallest steps you must complete to finish this tutorial successfully.

To finish this tutorial you will need:
mlflow (recommended version 1.18.0). You can install it by running this command in your terminal: pip install mlflow==1.18.0 (a quick check that the installation works is sketched after this list).
Docker Desktop application. You can download it from the official website: https://www.docker.com/products/docke...
Anaconda software, to create a conda environment with a Python 3.6 kernel. You can download it from the official website: https://www.anaconda.com/products/ind...
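
Once the dependencies are installed, a quick way to confirm that mlflow works (step P4 below) is to record a tiny tracking run. This is only a sketch; the run, parameter and metric names are arbitrary placeholders:

    # Minimal sanity check that mlflow is installed and can record a run.
    # Run it inside the conda environment created for this tutorial.
    import mlflow

    print(mlflow.__version__)  # expected: 1.18.0 (the version recommended above)

    with mlflow.start_run(run_name="sanity-check"):
        mlflow.log_param("alpha", 0.5)      # arbitrary example parameter
        mlflow.log_metric("rmse", 0.123)    # arbitrary example metric

    # Afterwards, running `mlflow ui` in the same folder shows the run in the MLflow UI.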

In this tutorial we will use:
MLflow: an open source platform to manage the ML lifecycle, including experimentation, reproducibility, deployment, and a central model registry. See the official website here: https://www.mlflow.org
AWS Elastic Container Registry (AWS ECR): a fully managed container registry that makes it easy to store, manage, share, and deploy your container images and artifacts anywhere (https://aws.amazon.com/ecr/).
AWS SageMaker: a service that helps data scientists and developers prepare, build, train, and deploy high-quality machine learning models quickly by bringing together a broad set of capabilities purpose-built for machine learning.
AWS IAM (Identity and Access Management): it enables you to manage access to AWS services and resources securely (https://aws.amazon.com/iam).
AWS CLI (Command Line Interface): a unified tool to manage your AWS services (https://aws.amazon.com/cli). A quick way to check that the CLI credentials and IAM user are configured correctly is sketched after this list.
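
The IAM and AWS CLI setup (step P3 below) is easy to get wrong, so it helps to confirm which identity your local credentials resolve to before building or deploying anything. This is only a sketch; it assumes aws configure has already been run, and the region below is a placeholder:

    # Confirm which AWS identity the locally configured credentials belong to.
    # Assumes `aws configure` has already been run; the region is a placeholder.
    import boto3

    sts = boto3.client("sts", region_name="eu-west-1")
    identity = sts.get_caller_identity()

    print("Account:", identity["Account"])   # AWS account ID
    print("User ARN:", identity["Arn"])      # ARN of the IAM user/role in use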

The main idea of this tutorial is to make your ML model trackable by the MLflow User Interface (MLflow UI), so that you can track model performance across experiments. Using MLflow functionality, you will create two Docker images: the first one is kept locally, and the other one is pushed to AWS ECR, where a dedicated repository is created for the image containing all information about our ML model. We then use this image on AWS ECR to deploy our ML model to AWS SageMaker. Remember to add the required IAM roles and permissions for SageMaker and for the S3 bucket where all model artifacts will be saved. Finally, we will be able to use the model and make predictions on new data fed to it from anywhere using Python scripts.
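
The two MLflow-specific pieces of that workflow (logging the model so it shows up in the MLflow UI, and deploying the ECR image to SageMaker) roughly look as follows. This is only a sketch written against the mlflow 1.x API; the toy model, app name, ECR image URL and IAM role ARN are placeholders, and the image URL stands for the one produced by the mlflow sagemaker build-and-push-container step shown in the video:

    # Sketch of the two MLflow steps described above (mlflow 1.x API).
    # All names below (app name, image URL, role ARN) are placeholders.
    import mlflow
    import mlflow.sagemaker
    import mlflow.sklearn
    from sklearn.linear_model import LinearRegression

    # 1) Adapt the training code so the model is logged to MLflow (visible in the MLflow UI).
    with mlflow.start_run() as run:
        model = LinearRegression().fit([[0.0], [1.0], [2.0]], [0.0, 1.0, 2.0])
        mlflow.log_param("fit_intercept", True)
        mlflow.sklearn.log_model(model, artifact_path="model")
        model_uri = f"runs:/{run.info.run_id}/model"

    # 2) Deploy the logged model to SageMaker, using the container image that
    #    `mlflow sagemaker build-and-push-container` pushed to AWS ECR.
    mlflow.sagemaker.deploy(
        app_name="my-sagemaker-app",  # placeholder endpoint/application name
        model_uri=model_uri,
        image_url="123456789012.dkr.ecr.eu-west-1.amazonaws.com/mlflow-pyfunc:1.18.0",  # placeholder
        execution_role_arn="arn:aws:iam::123456789012:role/my-sagemaker-role",          # placeholder
        region_name="eu-west-1",
        mode="create",
    )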

The content of the tutorial:
0:00 - Intro
0:43 - P1. Prepare your Python virtual environment
2:26 - P2. Install dependencies on your virtual environment
5:47 - P3. Setup AWS IAM user and AWS CLI configuration
12:29 - P4. Test if mlflow is working properly
14:09 - P5. Adapt your ML training code for mlflow
25:24 - P6. Build a Docker Image and push it to AWS ECR
33:50 - P7. Deploy Image from AWS ECR to AWS SageMaker
49:18 - P8. Use the deployed model with the new data and make predictions
52:46 - Bonus: Github repo of this tutorial and Thank you!

At the end of this lesson, you will be able to make predictions with your ML model from anywhere using boto3, through the model inference endpoint you have built on AWS SageMaker.
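
A minimal sketch of such a prediction call with boto3 is shown below. The endpoint name, region and feature columns are placeholders, and it assumes the mlflow 1.x scoring container, which accepts pandas split-oriented JSON:

    # Sketch of querying the deployed SageMaker endpoint with boto3.
    # Endpoint name, region and feature columns are placeholders.
    import json
    import boto3
    import pandas as pd

    runtime = boto3.client("sagemaker-runtime", region_name="eu-west-1")  # placeholder region

    # New data to score, with the same columns the model was trained on (placeholders here).
    new_data = pd.DataFrame({"feature_1": [0.5], "feature_2": [1.2]})

    response = runtime.invoke_endpoint(
        EndpointName="my-sagemaker-app",                       # the app_name used at deploy time
        ContentType="application/json; format=pandas-split",   # format accepted by mlflow 1.x scoring servers
        Body=new_data.to_json(orient="split"),
    )

    predictions = json.loads(response["Body"].read().decode("utf-8"))
    print(predictions)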

The full explained steps are clearly written with screenshots in this repo: https://github.com/vb100/deploy-ml-ml...

If you need any clarification or want more details on any step, let me know.

#mlflow #sagemaker #docker
