FastAPI is a modern, blazing-fast framework for building APIs with Python 3.6 and above.
Its main features are:

- **Speed**: Very high performance, on par with NodeJS and Go (thanks to Starlette and Pydantic). One of the fastest Python frameworks available.
- **Fast to code**: Increases development speed by roughly 2-3x. (*)
- **Fewer bugs**: Reduces human-induced bugs by about 40%. (*)
- **Intuitive**: Great editor support and completion everywhere, which cuts down debugging time.
- **Easy**: Designed to be easy to write and understand, so you don't have to spend a long time reading documentation.
- **Short**: Minimizes code duplication; a single declaration provides multiple features depending on the parameters you pass.
- **Robust**: The same code you run in development works in production.
- **Swagger included**: The API you build is automatically documented with the bundled Swagger UI, and each endpoint can be executed from it.

(*) According to internal research by the FastAPI team.
As they say, "code speaks louder than words," so let's try it right away.
First, create a working folder:

```shell
mkdir fastapi-practice
```

Then install the required packages:

```shell
pip install fastapi sqlalchemy uvicorn mysqlclient
```
If you'd rather avoid a global installation, use poetry or a similar tool (I will switch to poetry later in this article anyway).
Create the file needed to run FastAPI:

```shell
touch main.py
```

Then write the code as follows.

main.py

```python
from fastapi import FastAPI

app = FastAPI()


@app.get("/")
def index():
    return {"Hello": "World"}
```
This alone is enough to start the server:

```shell
uvicorn main:app
```

Open http://localhost:8000/ in a browser and you should see `{"Hello": "World"}`.
Blazing fast, FastAPI. What's more, the API specification is generated automatically with Swagger!! (Surprise!) Open http://localhost:8000/docs and you should see the spec rendered in a stylish UI.
You can even expand an endpoint and press the "Try it out" button to actually send a request and see the response! (Impressive!!)
On top of that, http://localhost:8000/redoc is generated automatically as well, giving you more detailed documentation for free! (Amazing!!!)
Now let's create a RESTful API with FastAPI + MySQL, assuming a real-world use case.
FastAPI is easy to containerize, so let's run MySQL and FastAPI in Docker containers and have them communicate with each other.
First, create docker-compose.yml, docker-sync.yml, and a Dockerfile in the folder:

```shell
touch docker-compose.yml docker-sync.yml Dockerfile
```
I won't explain Docker in detail here. In short: the Dockerfile holds the information for building the container image, docker-compose.yml defines the commands to run in the resulting containers, and docker-sync.yml syncs files between the local development environment and the Docker container in real time.
For how to use docker-sync, the following article (written by someone else) should be helpful, so please have a look. You can get by without docker-sync, but I use it to make the sync speed blazing fast!
https://qiita.com/Satoshi_Numasawa/items/278a143aa41735e1b0da
Now let's write the code, starting with the Dockerfile.

Dockerfile

```dockerfile
FROM python:3.8-alpine

RUN apk --update-cache add python3-dev mariadb-dev gcc make build-base libffi-dev libressl-dev

WORKDIR /app

RUN pip install poetry
```
We'll use poetry for package management. There are alternatives such as pipenv and pyflow, so pick whichever you prefer.
https://qiita.com/sk217/items/43c994640f4843a18dbe summarizes each package manager in an easy-to-understand way. If you are interested, take a look.
Next, docker-sync.yml.

docker-sync.yml

```yaml
version: "2"
options:
  verbose: true
syncs:
  fastapi-practice-sync:
    src: "."
    notify_terminal: true
    sync_strategy: "native_osx"
    sync_userid: "1000"
    sync_excludes: [".git", ".gitignore", ".venv"]
```
And docker-compose.yml. (Note: the sync volume name must match the sync key defined in docker-sync.yml, i.e. fastapi-practice-sync, and MySQL listens on port 3306 inside the container.)

docker-compose.yml

```yaml
version: "3"
services:
  db:
    image: mysql:latest
    command: --default-authentication-plugin=mysql_native_password
    restart: always
    environment:
      MYSQL_DATABASE: fastapi_practice_development
      MYSQL_USER: root
      MYSQL_PASSWORD: "password"
      MYSQL_ROOT_PASSWORD: "password"
    ports:
      - "3306:3306"
    volumes:
      - mysql_data:/var/lib/mysql
  fastapi:
    build:
      context: .
      dockerfile: "./Dockerfile"
    command: sh -c "poetry install && poetry run uvicorn main:app --reload --host 0.0.0.0 --port 8000"
    ports:
      - "8000:8000"
    depends_on:
      - db
    volumes:
      - fastapi-practice-sync:/app:nocopy
      - poetry_data:/root/.cache/pypoetry/
volumes:
  mysql_data:
  poetry_data:
  fastapi-practice-sync:
    external: true
```
The trick in docker-compose.yml is to use persistent and ad hoc data properly: non-persistent data is reset every time you run docker-compose down.
Here, the MySQL data and the packages installed by poetry are persisted in named volumes, so the database does not come up empty and the packages do not have to be downloaded every time the container starts.
Also, since I want the code I write to be synchronized by docker-sync, I mount it as fastapi-practice-sync:/app:nocopy so that Docker itself does not copy files over it.
MySQL is built by pulling the latest image from Docker Hub.
That's it for the Docker setup.
Next, set up poetry to install the packages required for FastAPI. Run:

```shell
poetry init
```

in the terminal. Setup then proceeds interactively, so answer the yes/no prompts. (The defaults are fine, so you can simply press Enter repeatedly.)
A file called pyproject.toml should now have been created.
Package dependency information is recorded there, so let's install the packages required to run FastAPI using poetry:

```shell
poetry add fastapi sqlalchemy uvicorn mysqlclient
```

Enter this and wait for the installation to finish. When it's done, open pyproject.toml and you'll see the installed packages listed.
pyproject.toml

```toml
[tool.poetry]
name = "fastapi-practice"
version = "0.1.0"
description = ""
authors = ["Your Name <[email protected]>"]

[tool.poetry.dependencies]
python = "^3.8"
fastapi = "^0.61.1"
sqlalchemy = "^1.3.19"
uvicorn = "^0.11.8"

[tool.poetry.dev-dependencies]

[build-system]
requires = ["poetry>=0.12"]
build-backend = "poetry.masonry.api"
```
After that, all you have to do is type:

```shell
docker-compose build
```

to build the image, then enter:

```shell
docker-sync-stack start
```

docker-sync-stack start is a command that runs docker-sync start and docker-compose up at the same time; it's convenient because it also gives you nicely combined logs. When installing a new package, first install it locally with poetry add, then restart the Docker container and it should be synced inside the container as well!
Next, let's connect to the DB (MySQL).
Since we're studying CRUD, the classic exercise is a todo list! So we will define a Todo table and run a migration.

Todos table

| column  | datatype |
| ------- | -------- |
| id      | integer  |
| title   | string   |
| content | string   |
| done    | boolean  |

We will migrate the table with this structure.
First, create a file that defines the database:

```shell
touch db.py
```
Write the following contents.

db.py

```python
from sqlalchemy import Boolean, Column, Integer, String, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

user_name = "root"
password = "password"
host = "db"  # the service name of the MySQL container in docker-compose.yml
database_name = "fastapi_practice_development"

DATABASE = f'mysql://{user_name}:{password}@{host}/{database_name}'

engine = create_engine(
    DATABASE,
    encoding="utf-8",
    echo=True
)

Base = declarative_base()
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


class Todo(Base):
    __tablename__ = 'todos'
    id = Column(Integer, primary_key=True, autoincrement=True)
    title = Column(String(30), nullable=False)
    content = Column(String(300), nullable=False)
    done = Column(Boolean, default=False)


def get_db():
    # Yield a session per request and always close it afterwards
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()


def main():
    # Drop and recreate all tables defined on Base (destructive!)
    Base.metadata.drop_all(bind=engine)
    Base.metadata.create_all(bind=engine)


if __name__ == "__main__":
    main()
```
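The get_db function above is a yield-based dependency: FastAPI injects the yielded session into the endpoint, then runs the finally block after the response is sent. A minimal pure-Python sketch of that life cycle (with a dict standing in for the session):

```python
def get_resource():
    resource = {"closed": False}   # stand-in for SessionLocal()
    try:
        yield resource             # handed to the endpoint via Depends
    finally:
        resource["closed"] = True  # cleanup, like db.close()


gen = get_resource()
resource = next(gen)        # FastAPI pulls the yielded value before the request
assert resource["closed"] is False
next(gen, None)             # after the response, FastAPI resumes the generator
assert resource["closed"] is True  # the finally block has run
```

This is why the session is always closed, even if the endpoint raises an exception.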
With FastAPI, the mainstream approach is to map the database to Python objects using sqlalchemy, one of the most widely used ORMs (Object-Relational Mappers) in Python.
After writing this, enter the Docker container and run the migration. Launch the containers in sync mode with:

```shell
docker-sync-stack start
```

Then look at the list of running containers:

```shell
docker container ls
```

and enter the FastAPI container:

```shell
docker exec -it {container name} sh
```

Inside the container, run the migration with:

```shell
poetry run python db.py
```

The migration should run without problems and the table will be created successfully!
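If you want to sanity-check the migration logic without the MySQL container, the same create_all flow can be exercised against an in-memory SQLite database (a stand-in used here purely for illustration):

```python
from sqlalchemy import Boolean, Column, Integer, String, create_engine, inspect
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()


class Todo(Base):
    __tablename__ = "todos"
    id = Column(Integer, primary_key=True, autoincrement=True)
    title = Column(String(30), nullable=False)
    content = Column(String(300), nullable=False)
    done = Column(Boolean, default=False)


# In-memory SQLite needs no server; create_all emits the CREATE TABLE DDL
engine = create_engine("sqlite://")
Base.metadata.create_all(bind=engine)
print(inspect(engine).get_table_names())  # ['todos']
```

The only difference from db.py is the connection URL, which is exactly why keeping it in one place is convenient.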
Next, let's write the CRUD handlers with FastAPI.
With extensibility in mind, we'll use FastAPI's built-in include_router feature to split the code into separate files.

```shell
mkdir routers
touch routers/todo.py
```

We will write the CRUD handlers in routers/todo.py.
routers/todo.py

```python
from fastapi import Depends, APIRouter
from sqlalchemy.orm import Session
from pydantic import BaseModel

from db import Todo, get_db

router = APIRouter()


class TodoCreate(BaseModel):
    title: str
    content: str
    done: bool


class TodoUpdate(BaseModel):
    title: str
    content: str
    done: bool


@router.get("/")
def read_todos(db: Session = Depends(get_db)):
    todos = db.query(Todo).all()
    return todos


@router.get("/{todo_id}")
def read_todo_by_todo_id(todo_id: int, db: Session = Depends(get_db)):
    todo = db.query(Todo).filter(Todo.id == todo_id).first()
    return todo


@router.post("/")
def create_todo(todo: TodoCreate, db: Session = Depends(get_db)):
    db_todo = Todo(title=todo.title, content=todo.content, done=todo.done)
    db.add(db_todo)
    db.commit()


@router.put("/{todo_id}")
def update_todo(todo_id: int, todo: TodoUpdate, db: Session = Depends(get_db)):
    db_todo = db.query(Todo).filter(Todo.id == todo_id).first()
    db_todo.title = todo.title
    db_todo.content = todo.content
    db_todo.done = todo.done
    db.commit()


@router.delete("/{todo_id}")
def delete_todo(todo_id: int, db: Session = Depends(get_db)):
    db_todo = db.query(Todo).filter(Todo.id == todo_id).first()
    db.delete(db_todo)
    db.commit()
```
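The TodoCreate and TodoUpdate models are what give these endpoints automatic request-body validation. A quick standalone sketch of what pydantic does with incoming data (outside FastAPI):

```python
from pydantic import BaseModel, ValidationError


class TodoCreate(BaseModel):
    title: str
    content: str
    done: bool


# Valid payloads are parsed into typed attributes
todo = TodoCreate(title="shopping", content="buy milk", done=False)
assert todo.done is False

# Invalid payloads raise ValidationError, which FastAPI turns into a 422 response
try:
    TodoCreate(title="shopping", content="buy milk", done="not-a-bool")
except ValidationError:
    print("rejected with a validation error")
```

So the endpoint bodies never have to check types themselves; by the time create_todo runs, todo is guaranteed to be well-formed.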
That's a rough sketch of the CRUD operations. You simply write the HTTP method after @router, along with the URL of the target resource.
Then edit main.py so it loads this router.
main.py

```python
from fastapi import FastAPI

from routers import todo

app = FastAPI()

app.include_router(
    todo.router,
    prefix="/todos",
    tags=["todos"],
    responses={404: {"description": "Not found"}},
)
```
prefix creates the URL path for you, and tags group the endpoints so the docs are easy to read.
Now connect to http://localhost:8000/docs and it should look like this!
Expand an endpoint and press the "Try it out" button to exercise the CRUD handlers!
As it stands, calling the API from a front end will raise a CORS error, so add the following CORS handling when calling it from another application.
main.py (addition)

```python
from starlette.middleware.cors import CORSMiddleware

app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
```
So, how was FastAPI? Being able to build an API with this little code is very attractive. It looks like a great fit for writing microservices in Python.
Since this was my first Qiita post, please ask if you have any questions! From now on, I want to publish to Qiita as much as possible... (I'll do my best.)