Chalice and Lambda are new to me, so I'm excited to write up my first serverless application.
Source code: naokit-dev / python3_chalice_on_docker
Chalice is a Python framework provided by AWS for easily deploying serverless applications on Lambda. See: Documentation — AWS Chalice
docker --version
Docker version 19.03.13, build 4484c46d9d
docker-compose --version
docker-compose version 1.27.4, build 40524192
In addition, **an AWS access key is required**.
Check the available images on Docker Hub: python - Docker Hub
AWS Chalice works with any Python version supported by Lambda, but Python 3 is recommended:
"AWS Chalice supports all versions of python supported by AWS Lambda, which includes python2.7, python3.6, python3.7, python3.8. We recommend you use a version of Python 3." (Quickstart — AWS Chalice)
Here, I will use the 3.8-alpine image.
Create a new workspace in VS Code. I named mine "python3_chalice_on_docker".
(The following steps are not strictly necessary, but I tried them as the minimal setup to confirm Python runs.)
Pull the Docker Hub image and launch the container
-it: attach to standard input (interactive, with a TTY)
--rm: delete the container on exit
-v <host_path>:<container_path>: mount host_path as a volume at container_path
docker run -it --rm -v $PWD:/python python:3.8-alpine /bin/sh
(If you try to mount with a relative path like -v .:/python, you will get an error, but the environment variable $PWD can be used instead, so -v $PWD:/python works fine. Ref: "I want to docker run even when specifying a relative volume path!" — Qiita)
# python --version
Python 3.8.6
From here on, work in the workspace you created earlier.
Create Dockerfile
Chalice is installed in the image with pip install chalice.
touch Dockerfile
FROM python:3.8-alpine
WORKDIR /app
RUN pip install chalice
CMD ["/bin/sh"]
Next, create docker-compose.yml
In addition to the port mapping and volume definition, environment variables defined in .env are made available inside the container.
touch docker-compose.yml
version: "3.8"
services:
app:
build: .
ports:
- "80:8000"
volumes:
- .:/app
command: chalice local --host=0.0.0.0 --port=8000
tty: true
stdin_open: true
working_dir: "${APP_PATH}"
environment:
- AWS_ACCESS_KEY_ID=${AWS_ACCESS_KEY_ID}
- AWS_SECRET_ACCESS_KEY=${AWS_SECRET_ACCESS_KEY}
- AWS_DEFAULT_REGION=${AWS_DEFAULT_REGION}
Create .env
Define environment variables here
`APP_NAME` is left blank for now.
The others can stay as they are for the moment, but you will eventually fill in the credentials needed to deploy to AWS.
touch .env
APP_NAME=
APP_PATH=/app/${APP_NAME}
AWS_ACCESS_KEY_ID=[YOUR_ACCESS_KEY_ID]
AWS_SECRET_ACCESS_KEY=[YOUR_SECRET_ACCESS_KEY]
AWS_DEFAULT_REGION=ap-northeast-1
If AWS credentials are already stored on your machine, you can check them with:
cat ~/.aws/credentials
Create a new project with chalice new-project <project_name>
docker-compose run app chalice new-project test_chalice
The project now has the following structure:
.
├── .env
├── Dockerfile
├── docker-compose.yml
└── test_chalice
├── .chalice
│ └── config.json
├── .gitignore
├── app.py
└── requirements.txt
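For reference, the generated .chalice/config.json typically looks like this (the exact contents may vary slightly by Chalice version):

```json
{
  "version": "2.0",
  "app_name": "test_chalice",
  "stages": {
    "dev": {
      "api_gateway_stage": "api"
    }
  }
}
```

The api_gateway_stage value is why the deployed URL later ends in /api/.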
Edit .env
Enter the project name in APP_NAME, and fill in AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY here as well.
The region is set to `ap-northeast-1`; change it as appropriate.
APP_NAME=test_chalice
APP_PATH=/app/${APP_NAME}
AWS_ACCESS_KEY_ID=xxxxxxxxxxxxxxxxxx
AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxx
AWS_DEFAULT_REGION=ap-northeast-1
Start local server
docker-compose up
The command in docker-compose.yml is overridden with command: chalice local --host=0.0.0.0 --port=8000, so chalice local is executed when you run docker-compose up.
Port 80 on the host is mapped to port 8000 in the container via ports: - "80:8000", so accessing localhost from the host is forwarded to the Chalice local server.
curl localhost
{"hello":"world"}%
{"hello": "world"} is returned.
Take a look at test_chalice/app.py.
Uncomment the following commented-out part so that it reads:
@app.route('/hello/{name}')
def hello_name(name):
    # '/hello/james' -> {"hello": "james"}
    return {'hello': name}
Now try accessing /hello/chalice:
curl localhost/hello/chalice
{"hello":"chalice"}%
{"hello": "chalice"} is returned; you can see the RESTful routing at work.
Stop the local server:
docker-compose down
Deploy it as an AWS Lambda function with chalice deploy:
docker-compose run app chalice deploy
Creating deployment package.
Creating IAM role: test_chalice-dev
Creating lambda function: test_chalice-dev
Creating Rest API
Resources deployed:
- Lambda ARN: arn:aws:lambda:ap-northeast-1:xxxxxxxxxxxx:function:test_chalice-dev
- Rest API URL: https://xxxxxxxxxx.execute-api.ap-northeast-1.amazonaws.com/api/
Try to access the Rest API URL
curl https://xxxxxxxxxx.execute-api.ap-northeast-1.amazonaws.com/api/
{"hello":"world"}%
curl https://xxxxxxxxxx.execute-api.ap-northeast-1.amazonaws.com/api/hello/lambda
{"hello":"lambda"}%
It seems that chalice parses the code and deploys it as a Lambda function together with the required IAM role.
Next, I would like to try a slightly practical app.
Ref. Documentation — AWS Chalice