A memo for myself :sweat: This is a note on building a TensorFlow + Flask + Nginx environment with Docker Compose. It started as plain notes on a TensorFlow app I was making, but I decided to split them out and organize them. If you follow this procedure, a Web API using TensorFlow should work :sweat: Please note that I wrote this article for myself, so it may be hard to follow, and the information and technology may be out of date :bow_tone1:
References I consulted when writing this article :bow_tone1:
Ubuntu version
$ cat /etc/os-release
NAME="Ubuntu"
VERSION="18.04.4 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.4 LTS"
VERSION_ID="18.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=bionic
UBUNTU_CODENAME=bionic
Docker version
$ docker version
Client: Docker Engine - Community
Version: 19.03.8
API version: 1.40
Go version: go1.12.17
Git commit: afacb8b7f0
Built: Wed Mar 11 01:25:46 2020
OS/Arch: linux/amd64
Experimental: false
Server: Docker Engine - Community
Engine:
Version: 19.03.8
API version: 1.40 (minimum version 1.12)
Go version: go1.12.17
Git commit: afacb8b7f0
Built: Wed Mar 11 01:24:19 2020
OS/Arch: linux/amd64
Experimental: false
containerd:
Version: 1.2.13
GitCommit: 7ad184331fa3e55e52b890ea95e65ba581ae3429
runc:
Version: 1.0.0-rc10
GitCommit: dc9208a3303feef5b3839f4323d9beb36df0a9dd
docker-init:
Version: 0.18.0
GitCommit: fec3683
Docker-Compose version
$ docker-compose version
docker-compose version 1.25.5, build unknown
docker-py version: 4.2.0
CPython version: 3.7.4
OpenSSL version: OpenSSL 1.1.1c 28 May 2019
* For some reason the build shows as "unknown". It looked like it would take time to fix, so I gave up :sob:
I am building it properly (just don't stare at it too hard :no_good_tone1:). There are a lot of junk files left in it, but everything is on GitHub: Source
Directory structure
dk_tensor_fw
├── app_tensor
│   ├── Dockerfile
│   ├── exeWhatMusic.py
│   ├── inputFile
│   │   └── ans_studyInput_fork.txt
│   ├── mkdbAndStudy.py
│   ├── requirements.txt
│   ├── studyModel
│   │   ├── genre-model.hdf5
│   │   ├── genre-tdidf.dic
│   │   └── genre.pickle
│   ├── tfidfWithIni.py
│   └── webQueApiRunServer.py
├── docker-compose.yml
└── web_nginx
    ├── Dockerfile
    └── nginx.conf
docker-compose.yml
version: '3'
services:
  ########### App server settings ###########
  app_tensor:
    container_name: app_tensor
    # Service restart policy
    restart: always
    # Directory containing the Dockerfile to build
    build: ./app_tensor
    volumes:
      # Directory to mount
      - ./app_tensor:/dk_tensor_fw/app_tensor
    ports:
      # Host-side port : container-side port
      - 7010:7010
    networks:
      - nginx_network
  ########### App server settings ###########
  ########### Web server settings ###########
  web-nginx:
    container_name: web-nginx
    build: ./web_nginx
    volumes:
      # Directory to mount
      - ./web_nginx:/dk_tensor_fw/web_nginx
    ports:
      # Forward port 7020 on the host PC to port 7020 in the container
      - 7020:7020
    depends_on:
      # Dependency: start app_tensor before web-nginx
      - app_tensor
    networks:
      - nginx_network
  ########### Web server settings ###########
networks:
  nginx_network:
    driver: bridge
(Reference) How to check free ports
#Check free ports (free if nothing is displayed)
netstat -an | grep 7010
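As a supplement to the netstat check above, the same check can be done from Python with only the standard library (a minimal sketch, not part of the original setup):

```python
import socket

def port_is_free(port: int, host: str = "127.0.0.1") -> bool:
    # Try to bind to the port: if the bind succeeds, nothing is listening there.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False

print(port_is_free(7010))
```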
Dockerfile ← App server side (Gunicorn)
FROM ubuntu:18.04
WORKDIR /dk_tensor_fw/app_tensor
COPY requirements.txt /dk_tensor_fw/app_tensor
RUN apt-get -y update \
&& apt-get -y upgrade \
&& apt-get install -y --no-install-recommends locales curl python3-distutils vim ca-certificates \
&& curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py \
&& python3 get-pip.py \
&& pip install -U pip \
&& localedef -i en_US -c -f UTF-8 -A /usr/share/locale/locale.alias en_US.UTF-8 \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* \
&& pip install -r requirements.txt --no-cache-dir
ENV LANG en_US.utf8
CMD ["gunicorn", "webQueApiRunServer:app", "-b", "0.0.0.0:7010"]
requirements.txt
Flask==1.1.0
gunicorn==19.9.0
Keras>=2.2.5
numpy==1.16.4
pandas==0.24.2
pillow>=6.2.0
python-dateutil==2.8.0
pytz==2019.1
PyYAML==5.1.1
requests==2.22.0
scikit-learn==0.21.2
sklearn==0.0
matplotlib==3.1.1
tensorboard>=1.14.0
tensorflow>=1.14.0
mecab-python3==0.996.2
The following Python source is the main body of the Web API: it runs inference with the trained model and returns a JSON response. The actual inference module (exeWhatMusic) is imported from outside :sweat_smile:
webQueApiRunServer.py
import flask
import os

import exeWhatMusic

# Port number
TM_PORT_NO = 7010

# Initialize our Flask application and pre-trained model
app = flask.Flask(__name__)
app.config['JSON_AS_ASCII'] = False  # <-- Avoid garbled Japanese characters

@app.route('/recommend/api/what-music/<how_music>', methods=['GET'])
def get_recom_music(how_music):
    recoMusicInfos = getRecoMusicMoji(how_music)
    return flask.jsonify({'recoMusicInfos': recoMusicInfos})

# Returns the recommended song name
def getRecoMusicMoji(how_music):
    recMusicName, predict_val = exeWhatMusic.check_genre(how_music)
    # Build the JSON payload
    recoMusicInfoJson = [
        {
            'id': 1,
            'recoMusicMoji': recMusicName,
            'predict_val': predict_val,
            'how_music': how_music
        }
    ]
    return recoMusicInfoJson

if __name__ == "__main__":
    print(" * Flask starting server...")
    app.run(threaded=False, host="0.0.0.0",
            port=int(os.environ.get("PORT", TM_PORT_NO)))
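To sanity-check the route without the trained model or a running container, one option is Flask's built-in test client with a stand-in for `exeWhatMusic.check_genre` (a hypothetical stub I am adding here; the real module runs TF-IDF + Keras inference against genre-model.hdf5):

```python
import flask

# Hypothetical stand-in for exeWhatMusic.check_genre: it mimics the real
# interface, returning a (song name, confidence) pair, but does no inference.
def fake_check_genre(how_music):
    return "dummy song", 0.5

app = flask.Flask(__name__)

@app.route('/recommend/api/what-music/<how_music>', methods=['GET'])
def get_recom_music(how_music):
    recMusicName, predict_val = fake_check_genre(how_music)
    return flask.jsonify({'recoMusicInfos': [{
        'id': 1,
        'recoMusicMoji': recMusicName,
        'predict_val': predict_val,
        'how_music': how_music,
    }]})

# Exercise the route in-process; no server or container needed
with app.test_client() as client:
    resp = client.get('/recommend/api/what-music/sad-song')
    print(resp.status_code, resp.get_json()['recoMusicInfos'][0]['recoMusicMoji'])
```

This only checks the routing and JSON shape, not the model itself.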
Dockerfile ← Web server side (Nginx)
FROM nginx:latest
RUN rm /etc/nginx/conf.d/default.conf
COPY nginx.conf /etc/nginx/conf.d
nginx.conf
upstream app_tensor_config {
    # Specifying the container's service name resolves to its address
    server app_tensor:7010;
}
server {
    listen 7020;
    root /dk_tensor_fw/app_tensor/;
    server_name localhost;

    location / {
        try_files $uri @flask;
    }

    location @flask {
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_redirect off;
        proxy_pass http://app_tensor_config;
    }

    # Redirect server error pages to the static page /50x.html
    error_page 500 502 503 504 /50x.html;
    location = /50x.html {
        root /usr/share/nginx/html;
    }

    # Route static file requests statically (not actually used here)
    location /static/ {
        alias /dk_tensor_fw/app_tensor/static/;
    }
}
$ docker-compose up -d --build
$ docker-compose images
Container Repository Tag Image Id Size
-----------------------------------------------------------------------
app_tensor dk_tensor_fw_app_tensor latest 3b916ea797e0 2.104 GB
web-nginx dk_tensor_fw_web-nginx latest 175c2596bb8b 126.8 MB
Did I build it badly? The image size is quite large :sweat:
$ docker-compose ps
Name Command State Ports
------------------------------------------------------------------------------------
app_tensor gunicorn webQueApiRunServe ... Up 0.0.0.0:7010->7010/tcp
web-nginx nginx -g daemon off; Up 0.0.0.0:7020->7020/tcp, 80/tcp
$ docker-compose exec app_tensor /bin/bash
root@ba0ce565430c:/dk_tensor_fw/app_tensor#
I entered the container on the app server side.
Some of the output below is omitted because it is long :sweat:
root@ba0ce565430c:/dk_tensor_fw/app_tensor# pip3 list
Package Version
---------------------- -----------
absl-py 0.9.0
Flask 1.1.0
gunicorn 19.9.0
Keras 2.3.1
Keras-Applications 1.0.8
Keras-Preprocessing 1.1.2
matplotlib 3.1.1
mecab-python3 0.996.2
numpy 1.16.4
pandas 0.24.2
Pillow 7.1.2
pip 20.1
python-dateutil 2.8.0
pytz 2019.1
PyYAML 5.1.1
requests 2.22.0
requests-oauthlib 1.3.0
rsa 4.0
scikit-learn 0.21.2
six 1.14.0
sklearn 0.0
tensorboard 2.2.1
tensorboard-plugin-wit 1.6.0.post3
tensorflow 2.2.0
tensorflow-estimator 2.2.0
(abridgement)
TensorFlow, Keras, and the rest all seem to be installed.
$ docker-compose exec web-nginx /bin/bash
root@d6971e4dc05c:/#
I also entered the container on the web server side.
root@d6971e4dc05c:/# /etc/init.d/nginx status
[ ok ] nginx is running.
Nginx is also running. That confirms the execution environment. If you can hit the Web API as shown below, the Web API side of the environment is working.
Web API execution example
http://localhost:7020/recommend/api/what-music/A song that is sad and wishes for someone's happiness
Any tool will do for sending the request; as in the GIF, the response comes back as JSON.
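For example, the request can be made from Python with only the standard library (a minimal sketch; it assumes the stack above is up and reachable at localhost:7020):

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

BASE_URL = "http://localhost:7020"  # host port mapped to the web-nginx container

def build_request_url(how_music: str) -> str:
    # Percent-encode the free-text query so Japanese characters survive the URL
    return f"{BASE_URL}/recommend/api/what-music/{quote(how_music)}"

def first_recommendation(body: str) -> dict:
    # The API wraps its results as {"recoMusicInfos": [ {...} ]}
    return json.loads(body)["recoMusicInfos"][0]

def fetch_recommendation(how_music: str) -> dict:
    # Live call: requires the docker-compose containers to be running
    with urlopen(build_request_url(how_music)) as resp:
        return first_recommendation(resp.read().decode("utf-8"))
```

With the containers up, `fetch_recommendation("A song that is sad and wishes for someone's happiness")` should return the same JSON as the browser example.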
* The commands below are taken as-is from the reference material; see the references for details :bow_tone1:
$ docker-compose stop
$ docker-compose start
# Stop & delete (container, network)
docker-compose down
# Stop & delete (container, network, image)
docker-compose down --rmi all
# Stop & delete (container, network, volume)
docker-compose down -v