This article is a continuation of **(Note) A web application that uses TensorFlow to infer recommended song names**. It builds a TensorFlow + Keras environment locally with docker-compose and covers everything up to hitting the Web API. Please note that I wrote this article for myself, so it may be hard to follow and the information and technology may be out of date :bow: Still, I hope it is helpful for anyone who wants to build some kind of web application on their own.
The actual web application looks like the GIF below. When I typed a sentence into the search box, it answered with Humbert Humbert's "Same Story" :clap: $\tiny{* Since there is little training data, only some songs will be hit. It's shabby}$ :bow_tone1: $\tiny{* Clicking the score link shows part of the score, but that is outside the scope of this article}$ :no_good_tone1:
I used various resources as references when creating this article :bow_tone1:
This is a continuation of **(Note) A web application that uses TensorFlow to infer recommended song names**. This time covers **environment construction (the execution environment)** on the Web API side.
Chapter | Classification | Status | Contents | Language, FW, environment, etc. |
---|---|---|---|---|
Preface | Common | Done | App overview | Python TensorFlow Keras Google Colaboratory |
Chapter 1 | Web API | Done | (This time) Environment construction (execution environment) | docker-compose Flask Nginx gunicorn |
Chapter 2 | Web API | Done | Machine learning | Python TensorFlow Keras Flask |
Chapter 3 | Screen | Not started | Environment construction | Python Django Nginx gunicorn PostgreSQL virtualenv |
Chapter 4 | Screen | Not started | Display, Web API call | Python Django |
Chapter 5 | AWS | Not started | AWS auto-deploy | GitHub EC2 CodeDeploy CodePipeline |
Ubuntu version
$ cat /etc/os-release
NAME="Ubuntu"
VERSION="18.04.4 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.4 LTS"
VERSION_ID="18.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=bionic
UBUNTU_CODENAME=bionic
Docker version
$ docker version
Client: Docker Engine - Community
Version: 19.03.8
API version: 1.40
Go version: go1.12.17
Git commit: afacb8b7f0
Built: Wed Mar 11 01:25:46 2020
OS/Arch: linux/amd64
Experimental: false
Server: Docker Engine - Community
Engine:
Version: 19.03.8
API version: 1.40 (minimum version 1.12)
Go version: go1.12.17
Git commit: afacb8b7f0
Built: Wed Mar 11 01:24:19 2020
OS/Arch: linux/amd64
Experimental: false
containerd:
Version: 1.2.13
GitCommit: 7ad184331fa3e55e52b890ea95e65ba581ae3429
runc:
Version: 1.0.0-rc10
GitCommit: dc9208a3303feef5b3839f4323d9beb36df0a9dd
docker-init:
Version: 0.18.0
GitCommit: fec3683
Docker-Compose version
$ docker-compose version
docker-compose version 1.25.5, build unknown
docker-py version: 4.2.0
CPython version: 3.7.4
OpenSSL version: OpenSSL 1.1.1c 28 May 2019
* For some reason the build shows as "unknown". I gave up on investigating because it looked time-consuming :sob:
In short, as long as docker-compose is installed properly and the necessary files are placed according to the [directory structure](#directory-structure) described below, all you need to run is the following command:
Build & start in the background
docker-compose up -d --build
The first build takes a while, but once the containers are created, hitting the Web API looks like the GIF below.
Web API execution example
http://localhost:7020/recommend/api/what-music/A song that is sad and wishes for someone's happiness
Any tool is fine for sending the request; as in the GIF, the response comes back as JSON.
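As a quick check, you can also call the endpoint from Python with the `requests` package that is already in requirements.txt. This is just a sketch I added for illustration; the exact keys in the JSON depend on what `webQueApiRunServer.py` actually returns, so the code simply prints the whole response.
Calling the Web API from Python (sketch)
import requests
from urllib.parse import quote

# Text to search for (same example as the URL above)
query = "A song that is sad and wishes for someone's happiness"
url = "http://localhost:7020/recommend/api/what-music/" + quote(query)

res = requests.get(url)
res.raise_for_status()

# The API returns JSON; the exact keys depend on the server implementation
print(res.json())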
I am building it properly $\tiny{* please don't look too closely}$ :no_good_tone1: There are a lot of junk files, but it is all on GitHub. Source
Directory structure
dk_tensor_fw
├── app_tensor
│   ├── Dockerfile
│   ├── exeWhatMusic.py
│   ├── inputFile
│   │   └── ans_studyInput_fork.txt
│   ├── mkdbAndStudy.py
│   ├── requirements.txt
│   ├── studyModel
│   │   ├── genre-model.hdf5
│   │   ├── genre-tdidf.dic
│   │   └── genre.pickle
│   ├── tfidfWithIni.py
│   └── webQueApiRunServer.py
├── docker-compose.yml
└── web_nginx
    ├── Dockerfile
    └── nginx.conf
docker-compose.yml
version: '3'

services:
  ########### App server settings ###########
  app_tensor:
    container_name: app_tensor
    # Service restart policy
    restart: always
    # Directory containing the Dockerfile to build
    build: ./app_tensor
    volumes:
      # Directory to mount
      - ./app_tensor:/dk_tensor_fw/app_tensor
    ports:
      # Host-side port : container-side port
      - 7010:7010
    networks:
      - nginx_network
  ########### App server settings ###########

  ########### Web server settings ###########
  web-nginx:
    container_name: web-nginx
    build: ./web_nginx
    volumes:
      # Directory to mount
      - ./web_nginx:/dk_tensor_fw/web_nginx
    ports:
      # Port forwarding from 7020 on the host PC to 7020 in the container
      - 7020:7020
    depends_on:
      # Dependency: start app_tensor before web-nginx
      - app_tensor
    networks:
      - nginx_network
  ########### Web server settings ###########

networks:
  nginx_network:
    driver: bridge
(Reference) How to check free ports
# Check free ports (the port is free if nothing is displayed)
netstat -an | grep 7010
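As a side note, if netstat is not available, roughly the same check can be done from Python with the standard socket module. This is only a small sketch I added; it is not part of the original setup.
(Reference) Port check from Python (sketch)
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    # connect_ex returns 0 when something is already listening on the port
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1)
        return sock.connect_ex((host, port)) == 0

print(port_in_use(7010))  # False means the port is free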
Dockerfile ← App server side (gunicorn)
FROM ubuntu:18.04
WORKDIR /dk_tensor_fw/app_tensor
COPY requirements.txt /dk_tensor_fw/app_tensor
RUN apt-get -y update \
&& apt-get -y upgrade \
&& apt-get install -y --no-install-recommends locales curl python3-distutils vim ca-certificates \
&& curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py \
&& python3 get-pip.py \
&& pip install -U pip \
&& localedef -i en_US -c -f UTF-8 -A /usr/share/locale/locale.alias en_US.UTF-8 \
&& apt-get clean \
&& rm -rf /var/lib/apt/lists/* \
&& pip install -r requirements.txt --no-cache-dir
ENV LANG en_US.utf8
CMD ["gunicorn", "webQueApiRunServer:app", "-b", "0.0.0.0:7010"]
requirements.txt
Flask==1.1.0
gunicorn==19.9.0
Keras>=2.2.5
numpy==1.16.4
pandas==0.24.2
pillow>=6.2.0
python-dateutil==2.8.0
pytz==2019.1
PyYAML==5.1.1
requests==2.22.0
scikit-learn==0.21.2
sklearn==0.0
matplotlib==3.1.1
tensorboard>=1.14.0
tensorflow>=1.14.0
mecab-python3==0.996.2
Dockerfile ← Web server side (Nginx)
FROM nginx:latest
RUN rm /etc/nginx/conf.d/default.conf
COPY nginx.conf /etc/nginx/conf.d
nginx.conf
upstream app_tensor_config {
    # Specifying the container's service name lets Docker resolve it to the container's address
    server app_tensor:7010;
}

server {
    listen 7020;
    root /dk_tensor_fw/app_tensor/;
    server_name localhost;

    location / {
        try_files $uri @flask;
    }

    location @flask {
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_redirect off;
        proxy_pass http://app_tensor_config;
    }

    # Redirect server error pages to the static page /50x.html
    error_page 500 502 503 504 /50x.html;
    location = /50x.html {
        root /usr/share/nginx/html;
    }

    # Route static file requests directly ← not actually needed here since it is unused
    location /static/ {
        alias /dk_tensor_fw/app_tensor/static/;
    }
}
$ docker-compose up -d --build
$ docker-compose images
Container Repository Tag Image Id Size
-----------------------------------------------------------------------
app_tensor dk_tensor_fw_app_tensor latest 3b916ea797e0 2.104 GB
web-nginx dk_tensor_fw_web-nginx latest 175c2596bb8b 126.8 MB
Did I build it badly? The image size is quite large :sweat:
$ docker-compose ps
Name Command State Ports
------------------------------------------------------------------------------------
app_tensor gunicorn webQueApiRunServe ... Up 0.0.0.0:7010->7010/tcp
web-nginx nginx -g daemon off; Up 0.0.0.0:7020->7020/tcp, 80/tcp
$ docker-compose exec app_tensor /bin/bash
root@ba0ce565430c:/dk_tensor_fw/app_tensor#
I entered the container on the app server side.
Part of the output is omitted because it is long :sweat:
root@ba0ce565430c:/dk_tensor_fw/app_tensor# pip3 list
Package Version
---------------------- -----------
absl-py 0.9.0
Flask 1.1.0
gunicorn 19.9.0
Keras 2.3.1
Keras-Applications 1.0.8
Keras-Preprocessing 1.1.2
matplotlib 3.1.1
mecab-python3 0.996.2
numpy 1.16.4
pandas 0.24.2
Pillow 7.1.2
pip 20.1
python-dateutil 2.8.0
pytz 2019.1
PyYAML 5.1.1
requests 2.22.0
requests-oauthlib 1.3.0
rsa 4.0
scikit-learn 0.21.2
six 1.14.0
sklearn 0.0
tensorboard 2.2.1
tensorboard-plugin-wit 1.6.0.post3
tensorflow 2.2.0
tensorflow-estimator 2.2.0
(omitted)
It looks like TensorFlow, Keras, and the rest are all installed.
$ docker-compose exec web-nginx /bin/bash
root@d6971e4dc05c:/#
I also entered the container on the web server side.
root@d6971e4dc05c:/# /etc/init.d/nginx status
[ ok ] nginx is running.
Nginx appears to be running as well. That completes the check of the execution environment. If hitting the Web API as in the [Web API execution example](#web-api-execution-example) above returns a response, the execution environment on the Web API side is ready.
This time I was able to put together the execution environment on the Web API side. I hope to brush it up and tidy it little by little when I have time :sob: It is not decided yet, but next time I would like to cover the machine learning part.
Chapter | Classification | Status | Contents | Language, FW, environment, etc. |
---|---|---|---|---|
Preface | Common | Done | App overview | Python TensorFlow Keras Google Colaboratory |
Chapter 1 | Web API | Done | Environment construction (execution environment) | docker-compose Flask Nginx gunicorn |
Chapter 2 | Web API | Done | Machine learning | Python TensorFlow Keras Flask |
Chapter 3 | Screen | Not started | Environment construction | Python Django Nginx gunicorn PostgreSQL virtualenv |
Chapter 4 | Screen | Not started | Display, Web API call | Python Django |
Chapter 5 | AWS | Not started | AWS auto-deploy | GitHub EC2 CodeDeploy CodePipeline |