This post covers:

--Building a data analysis environment using Docker
--Checking the connection between Python and PostgreSQL with docker-compose
For PostgreSQL, we use the official Docker image. For the Anaconda environment, we write a Dockerfile based on my own image.
FROM kyobad/anaconda3-alpine
MAINTAINER K.Kato

# Add analysis libraries on top of the Anaconda base image
RUN conda install -y seaborn psycopg2 networkx \
    && pip install chainer

# Work in the directory that will be mounted from the host
WORKDIR /root/notebook

# Launch Jupyter Notebook, listening on all interfaces
CMD ["/bin/sh", "-c", "jupyter notebook --no-browser --port=8888 --ip=0.0.0.0"]
This image installs analysis libraries such as seaborn and psycopg2 on top of the Anaconda environment.
Next, create a docker-compose configuration file to connect this Python environment to PostgreSQL.
docker-compose.yml
version: '2'
services:
  db:
    image: postgres
    restart: always
    environment:
      POSTGRES_USER: pythonist
      POSTGRES_PASSWORD: password
      POSTGRES_DB: mypyenv
    ports:
      - "5432"
    container_name: postgres-db
    volumes:
      - ./postgres-db:/var/lib/postgresql/data
      - ./docker-entrypoint-initdb.d:/docker-entrypoint-initdb.d
  python:
    build: .
    volumes:
      - ./workdir:/root/notebook/
    ports:
      - "8888:8888"
    links:
      - db
    depends_on:
      - db
The user name, password, and database name are set in the db service's environment section; change these as appropriate. The postgres-db volume mount stores the postgres data locally so that it is backed up (whether a data-only container would be preferable is an open question). docker-entrypoint-initdb.d holds initialization scripts: if you place SQL files such as table-creation scripts there, they are executed when the container first initializes the database. The python service's volume mounts a local directory so that the notebooks you create are saved on the host.
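For example, a table-creation script could be dropped into docker-entrypoint-initdb.d like the following (the file name init.sql and the table schema are just illustrative assumptions, not from the original setup):

docker-entrypoint-initdb.d/init.sql

-- Hypothetical example table; replace with your own schema
CREATE TABLE IF NOT EXISTS samples (
    id SERIAL PRIMARY KEY,
    label TEXT NOT NULL,
    value DOUBLE PRECISION
);

Note that the postgres image runs these scripts only the first time the data directory is initialized; once ./postgres-db contains data, they are skipped on subsequent starts.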
In the directory containing these files, run

docker-compose up

to launch the postgres container and the jupyter notebook container together.
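If you prefer to keep the containers running in the background, the standard compose commands apply (a sketch; adjust to your own workflow):

# Start both services in the background
docker-compose up -d
# Tail the notebook logs to grab the access URL
docker-compose logs -f python
# Stop and remove the containers when done
docker-compose down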
Now open localhost:8888 in your browser; from a notebook you can connect to the database using the user name, password, and database name written in the compose file.
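As a quick connection check, something like the following can be run in a notebook cell (a minimal sketch assuming the compose file above; the host name db resolves to the postgres service through the compose link):

import psycopg2

# Credentials and db name come from docker-compose.yml;
# "db" is the service name of the postgres container
conn = psycopg2.connect(
    host="db",
    port=5432,
    user="pythonist",
    password="password",
    dbname="mypyenv",
)

with conn.cursor() as cur:
    cur.execute("SELECT version();")  # simple connectivity check
    print(cur.fetchone())

conn.close()

If this prints the PostgreSQL version string, the two containers are talking to each other.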
To summarize:

--By using Docker, you can easily build and rebuild an analysis environment.
--Mounting the notebook and postgres data locally keeps them backed up and portable.