I needed to create a Flask server to send data to Elasticsearch, so I built it using docker-compose
Directory structure
fl/
├ Dockerfile
├ docker-compose.yml
└ src/
    ├ app.py
    └ requirements.txt
First, the Dockerfile. It specifies the Python version, the project directory, and the libraries to install.
Dockerfile
FROM python:3.6
ARG project_dir=/projects/
ADD src/requirements.txt $project_dir
WORKDIR $project_dir
RUN pip install -r requirements.txt
Next, write docker-compose.yml. It sets up the port mapping and the flask run command; --with-threads is the option that makes the development server handle requests in multiple threads.
docker-compose.yml
version: '3'
services:
  flask:
    build: .
    ports:
      - "6000:6000"
    volumes:
      - "./src:/projects"
    tty: true
    environment:
      TZ: Asia/Tokyo
    command: flask run --host 0.0.0.0 --port 6000 --with-threads
Next, requirements.txt. It lists the libraries to be installed with pip.
requirements.txt
flask
elasticsearch
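These entries are unpinned, so pip installs whatever is newest at build time. For reproducible builds it can help to pin versions; the versions below are only an illustration (chosen to fit the python:3.6 base image) and should match whatever you actually test against.
requirements.txt (pinned, versions illustrative)
flask==1.1.2
elasticsearch==7.10.1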
Finally, write the actual server.
app.py
import json

from flask import Flask, jsonify, request
from elasticsearch import Elasticsearch
from datetime import datetime

app = Flask(__name__)

#Elasticsearch host, port and index name
host = "192.168.xx"
port = "9200"
index_name = "XXXXX"

@app.route('/', methods=['POST'])  #Enter the URL path and allow only POST
def insert():
    data = json.loads(request.data)  #Get the data thrown into flask
    doc = {
        "location": data["location"]
    }
    es = Elasticsearch(
        hosts=[{'host': host, 'port': port}]
    )
    res = es.index(index=index_name, body=doc)
    return jsonify({"result": res["result"]})  #Return the indexing result to the caller

if __name__ == '__main__':
    app.run()
Go to the fl/ directory and start everything with
$ docker-compose up -d --build
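Once the containers are up, the endpoint can be exercised with a small client. This is only a test sketch: it assumes port 6000 is published on localhost as in docker-compose.yml, and the "Tokyo" value is an arbitrary example.
import json
import urllib.request

#Example payload with the "location" field that app.py expects
payload = json.dumps({"location": "Tokyo"}).encode("utf-8")

req = urllib.request.Request(
    "http://localhost:6000/",  #port published in docker-compose.yml
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

#Send the request and print the response returned by the Flask server
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode("utf-8"))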
If you want to take it down again, for example to make corrections, the containers, image, and volumes can be removed with
$ docker-compose down --rmi all --volumes
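As a final check, you can confirm that posted documents actually reached Elasticsearch. Here is a minimal sketch in the same elasticsearch-py style as app.py; host, port, and index_name are the same placeholders and need to be filled in with real values.
from elasticsearch import Elasticsearch

host = "192.168.xx"   #same placeholder host as in app.py
port = "9200"
index_name = "XXXXX"  #same placeholder index as in app.py

es = Elasticsearch(hosts=[{'host': host, 'port': port}])

#Fetch a few documents to confirm the POSTed data was indexed
res = es.search(index=index_name, body={"query": {"match_all": {}}, "size": 5})
for hit in res["hits"]["hits"]:
    print(hit["_id"], hit["_source"])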