Deploying Machine Learning Models as Microservices Using Docker

A REST-Based Architecture for Serving ML Model Outputs at Scale

Publisher: O'Reilly Media

Release Date: December 2017

Duration: 24 minutes

Modern applications running in the cloud often rely on REST-based microservices architectures built from Docker containers. Docker makes it straightforward for services to communicate with one another and for individual components to be composed and scaled. Data scientists use these techniques to efficiently move their machine learning models into production applications. This video teaches you how to deploy machine learning models behind a REST API, serving low-latency requests from applications, without using a Spark cluster. Along the way, you'll learn how to export models trained in SparkML; how to work with Docker, a convenient way to build, deploy, and ship application code for microservices; and how a model scoring service should support both single on-demand predictions and bulk predictions.

Learners should have basic familiarity with the following: Scala or Python; Hadoop, Spark, or Pandas; SBT or Maven; cloud platforms such as Amazon Web Services; and Bash, Docker, and REST.
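
As an illustration of the packaging step, a minimal Dockerfile sketch for a Python-based scoring service might look like the following. The file names (app.py, model.zip, requirements.txt) and port are hypothetical stand-ins, not taken from the video:

```dockerfile
FROM python:3.9-slim
WORKDIR /app

# Install the service's dependencies (hypothetical requirements file)
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the scoring service and the exported model artifact
COPY app.py model.zip ./

# Expose the REST endpoint and start the service
EXPOSE 8080
CMD ["python", "app.py"]
```

Building this image with `docker build` and running it with `docker run -p 8080:8080` yields a self-contained scoring service that can be deployed to any container host.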

  • Understand how to deploy machine learning models behind a REST API
  • Learn to utilize Docker containers for REST-based microservices architectures
  • Explore methods for exporting models trained in SparkML using a library like Combust MLeap
  • See how Docker builds, deploys, and ships application code for microservices
  • Discover how to deploy a model using exported PMML with a REST API in a Docker container
  • Learn to use AWS Elastic Container Service (ECS) to deploy a model hosting server in Docker
  • Pick up techniques that enable a model hosting server to read a model
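
The single and bulk prediction endpoints described above can be sketched with nothing but the Python standard library. Here `score` is a stand-in for a real model exported from SparkML (e.g., via MLeap or PMML) and loaded at startup; the route paths and JSON field names are assumptions for illustration, not the video's actual API:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


def score(features):
    """Stand-in for a real model's predict(); returns the feature mean."""
    return sum(features) / max(len(features), 1)


def predict_one(payload):
    # Single on-demand prediction: one feature vector in, one score out.
    return {"score": score(payload["features"])}


def predict_batch(payload):
    # Bulk prediction: a list of feature vectors in, a list of scores out.
    return {"scores": [score(f) for f in payload["instances"]]}


class ScoringHandler(BaseHTTPRequestHandler):
    """Routes POSTed JSON to the single or bulk prediction handler."""

    ROUTES = {"/predict": predict_one, "/predict/batch": predict_batch}

    def do_POST(self):
        handler = self.ROUTES.get(self.path)
        if handler is None:
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        body = json.dumps(handler(payload)).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


def main(port=8080):
    # Call main() to serve predictions, e.g. from a Docker CMD.
    HTTPServer(("0.0.0.0", port), ScoringHandler).serve_forever()
```

A production service would replace `score` with the deserialized model bundle read from disk at startup; clients then POST JSON to /predict for one prediction or /predict/batch for many.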

Mikhail Semeniuk is a data engineer with Shift Technologies. Mikhail worked for six years as a senior-level statistician for UnitedHealth Group, the largest health insurance provider in the United States. He holds a BS in Economics and Financial Math from the University of Minnesota.

Jason Slepicka is a senior data engineer with DataScience. Jason is working on his PhD in Computer Science at the University of Southern California Information Sciences Institute.

Table of Contents

Chapter: Deploying Machine Learning Models as Microservices Using Docker (00m 58s)

  • Overview of microservice architecture and REST APIs for model prediction (04m 54s)
  • Deploying a model behind a REST API in a Docker container (06m 35s)
  • Making single and batch predictions via a REST API (06m 07s)
  • Overview of concerns for managing REST APIs (05m 53s)
