I made an API with Docker that returns the predicted value of a machine learning model

Introduction

I wanted to serve the machine learning model [^1] that I built while checking its accuracy with Yellowbrick as a prediction API, so I set up an API server with Docker. I referred to [^2] for Flask and [^3] for Docker, and the goal is to get a response back from the API server.

Environment

The environment is as follows.

$sw_vers
ProductName:	Mac OS X
ProductVersion:	10.13.6
BuildVersion:	17G8037

I referred to [^4] for installing Docker. The output was quite long, so parts of it are omitted below.

$docker version
Client: Docker Engine - Community
 Version:           19.03.4
 API version:       1.40
 Go version:        go1.12.10
(omitted)
Server: Docker Engine - Community
 Engine:
  Version:          19.03.4
  API version:      1.40 (minimum version 1.12)
(omitted)

Build

Creating a Docker image

First, create a Docker image from a Dockerfile. Create a file named Dockerfile with the following contents in an editor.

FROM ubuntu:latest

RUN apt-get update
RUN apt-get install python3 python3-pip -y

RUN pip3 install flask
RUN pip3 install scikit-learn
RUN pip3 install numpy
RUN pip3 install scipy
RUN pip3 install lightgbm
RUN pip3 install joblib
RUN pip3 install pandas

Create an image with the docker command.

docker build . -t myflask/mlapi:1.0

After a while, the following message is displayed and the build is complete.

Successfully built 4a82ed953436
Successfully tagged myflask/mlapi:1.0

Confirm that the image has been created with the docker images command.

$docker images
REPOSITORY          TAG                 IMAGE ID            CREATED             SIZE
myflask/mlapi       1.0                 4a82ed953436        7 minutes ago       782MB
ubuntu              latest              775349758637        2 days ago          64.2MB

Next, enter the image (start a container from it) and check whether the libraries can be imported. I confirmed that both python3 and the lightgbm library are usable.

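For example, you can open a shell in a container built from this image with a command along these lines (the options here are just an illustration):

docker run -it --rm myflask/mlapi:1.0 /bin/bash
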
root@117483d4b9ed:/# python3
Python 3.6.8 (default, Oct  7 2019, 12:59:55) 
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import lightgbm
>>> exit()

Creating the API server program

Exit the container and create the API server program in the local environment. When I created the machine learning model, I saved it with joblib [^1], so I also use joblib to load the model. I referred to the article at [^2], but since the number of data items is different, I adapted the part that receives the request: the contents of the feature field are first put into a pandas DataFrame and then passed to the model for prediction.


from joblib import load
import flask
import numpy as np
import pandas as pd
import lightgbm as lgb

# initialize our Flask application and pre-trained model
app = flask.Flask(__name__)
model = None

def load_model():
    global model
    model = load("./lightgbm.joblib")

@app.route("/predict", methods=["POST"])
def predict():
    response = {
        "success": False,
        "Content-Type": "application/json"
    }

    if flask.request.method == "POST":
        if flask.request.get_json().get("feature"):

            # read feature from json and convert to dataframe
            features = flask.request.get_json().get("feature")
            df_X = pd.DataFrame.from_dict(features)

            # predict
            response["prediction"] = model.predict(df_X).tolist()

            # indicate that the request was a success
            response["success"] = True
    # return the data dictionary as a JSON response
    return flask.jsonify(response)

if __name__ == "__main__":
    load_model()
    print("Server is running ...")
    app.run(host='0.0.0.0', port=5000)
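
For reference, the lightgbm.joblib file loaded above was saved with joblib when the model was trained [^1]. A minimal sketch of the saving side, assuming the scikit-learn interface of LightGBM and Yellowbrick's load_bikeshare (the actual training code is in [^1]), looks like this:

from joblib import dump
from yellowbrick.datasets import load_bikeshare
import lightgbm as lgb

# load the bikeshare data and fit a LightGBM regressor (illustrative only)
X, y = load_bikeshare()
model = lgb.LGBMRegressor()
model.fit(X, y)

# persist the trained model; api.py loads this file with joblib's load()
dump(model, "./lightgbm.joblib")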

Starting the API server

Create a directory in the local environment to store the above program and machine learning model.

mkdir vol
mv lightgbm.joblib ./vol/
mv api.py ./vol/

Start a container from the image with the docker command, mounting the vol directory created above.

$docker run -it --rm -p 5000:5000 -v $(pwd)/vol:/home myflask/mlapi:1.0 /bin/bash

Running api.py under /home starts the API server.

root@5d1e3cf74246:/# cd home/
root@5d1e3cf74246:/home# python3 api.py 
Server is running ...
 * Serving Flask app "api" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on http://0.0.0.0:5000/ (Press CTRL+C to quit)

Checking the response

Since the server above listens on IP address 0.0.0.0 and port 5000, let's send it data to predict with the curl command. The training data was Yellowbrick's load_bikeshare, so the fields are matched to its columns, packed into feature, and sent to the server. Each value is given as a single-element list so that the JSON can be converted into a pandas DataFrame as is.

$curl http://0.0.0.0:5000/predict -X POST -H 'Content-Type:application/json' -d '{"feature":{"season":[1], "year":[0], "month":[1], "hour":[0], "holiday":[0], "weekday":[6], "workingday":[0], "weather":[1], "temp":[0.24], "feelslike":[0.3], "humidity":[0.8], "windspeed":[0.0]}}'

When I ran the above command, the following response came back. Yay.

{"Content-Type":"application/json","prediction":[34.67747315059312],"success":true}

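For reference, the reason every value in feature is a single-element list is that pandas then turns the dictionary into a one-row DataFrame, which is what the model expects. Roughly, with only two of the columns shown:

>>> import pandas as pd
>>> pd.DataFrame.from_dict({"season": [1], "temp": [0.24]})
   season  temp
0       1  0.24
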
In closing

Even if you are not very familiar with REST APIs or servers, you can build one fairly easily with Docker and Flask. Next time, I would like to make something that sends a prediction request from a web page and displays the returned value.
