- Developing locally, then deploying to a Linux environment to test, is a hassle
- You want to use Docker for your development environment
- But all the setup and configuration is tedious
For people like this, I will introduce the extension "Remote-Containers", which lets you conveniently build and work in a Docker-based development environment straight from VS Code.
The "Remote-Containers" extension runs a VS Code server inside a container and communicates with VS Code on the host machine, so you can develop inside the container as if you were developing locally.
The detailed configuration is illustrated in the official documentation.
In addition, it is possible to manage multiple development environments from VS Code and launch a container with one click.
This means that when you start developing, there is no need to start the container from the command line, attach a shell, and step inside it.
You can start developing in a container just as you would open a folder in VS Code locally.
The system requirements are as follows.
- Windows: Docker Desktop 2.0+ on Windows 10 Pro/Enterprise. (Docker Toolbox is not supported.)
- macOS: Docker Desktop 2.0+.
- Linux: Docker CE/EE 18.06+ and Docker Compose 1.21+. (The Ubuntu snap package is not supported.)
Docker
First, install Docker (the installation steps are omitted here).
I installed Docker Desktop on Windows and ran into more trouble than I had imagined.
Open the Extensions view in VS Code with Ctrl+Shift+X, search for "Remote-Containers", and install it.
It's officially from Microsoft, so it should be safe (?).
There is also an extension called "Remote Development" that bundles Remote-Containers, Remote-SSH, and Remote-WSL; installing that works too.
git
There are sample configuration files in Microsoft's official repositories on GitHub, so the smoothest way to start is to clone one.
So let's install git (steps omitted).
The sample repository is below.
This time I will use Python, but there are also samples for Node.js, Java, Go, and so on.
All you actually need are the Dockerfile and devcontainer.json under the .devcontainer directory, so you can bring just those over.
Remote-Containers also has a "Try a Sample" feature that lets you try these repositories without cloning them, but it is a little startling because the docker image build starts immediately.
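Copying the configuration over is just a normal file operation. A minimal sketch (all directory names here are placeholders, and the sample repo would really come from `git clone`):

```shell
# Stand-in for a cloned sample repository (in practice: git clone <sample repo URL>).
mkdir -p sample-repo/.devcontainer
touch sample-repo/.devcontainer/Dockerfile sample-repo/.devcontainer/devcontainer.json

# Bring just the .devcontainer directory into your own project.
mkdir -p my-project
cp -r sample-repo/.devcontainer my-project/
ls my-project/.devcontainer
```

After this, opening my-project in VS Code is enough for Remote-Containers to find the configuration.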
For example, suppose you want to create a development environment for a Python application.
Open the project directory from VS Code with the directory structure as follows.
project/
├ .devcontainer/
│ ├ Dockerfile
│ └ devcontainer.json
├ .git/
├ package/
│ ├ __init__.py
│ ├ __main__.py
│ └ module.py
├ requirements.txt
└ .gitignore
Press F1 (or click the green icon that appears at the bottom left) and select "Remote-Containers: Open Folder in Container...".
Then VS Code reads the Dockerfile and devcontainer.json under .devcontainer and launches the Docker container according to the settings.
Now let's take a look at the contents of the Dockerfile and devcontainer.json to understand exactly what's happening.
Dockerfile
This is an ordinary Dockerfile; there is nothing particularly special about it, but it does set up user privileges nicely.
#-------------------------------------------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See https://go.microsoft.com/fwlink/?linkid=2090316 for license information.
#-------------------------------------------------------------------------------------------------------------
FROM python:3
This item specifies the image that is the basis for building the Docker image.
If you want to pin the Python version precisely, change this to, for example, python:3.7.
ARG USERNAME=vscode
ARG USER_UID=1000
ARG USER_GID=$USER_UID
RUN apt-get update \
&& apt-get -y install --no-install-recommends apt-utils dialog 2>&1 \
&& apt-get -y install git iproute2 procps lsb-release \
~~~
&& groupadd --gid $USER_GID $USERNAME \
&& useradd -s /bin/bash --uid $USER_UID --gid $USER_GID -m $USERNAME \
&& apt-get install -y sudo \
&& echo $USERNAME ALL=\(root\) NOPASSWD:ALL > /etc/sudoers.d/$USERNAME \
&& chmod 0440 /etc/sudoers.d/$USERNAME \
This section configures apt-get and user permissions.
In short, the container is operated as a user named vscode who has sudo privileges.
devcontainer.json
This file is the heart of the extension.
"name": "Python 3",
"context": "..",
"dockerFile": "Dockerfile",
"settings": {
"terminal.integrated.shell.linux": "/bin/bash",
"python.pythonPath": "/usr/local/bin/python",
"python.linting.enabled": true,
"python.linting.pylintEnabled": true,
"python.linting.pylintPath": "/usr/local/bin/pylint"
},
"appPort": [ 9000 ],
"postCreateCommand": "sudo pip install -r requirements.txt",
"remoteUser": "vscode",
"extensions": [
"ms-python.python"
]
}
The items written here in JSON format are the settings for the Remote-Containers extension.
The official documentation lists many other configurable items.
By default, the project root directory is bind-mounted to /workspace in the container, but you can change this with the workspaceFolder and workspaceMount settings:
{
"workspaceFolder": "/home/vscode",
"workspaceMount": "type=bind,source=${localWorkspaceFolder},target=/home/vscode/project"
}
With this, the project is mounted under /home/vscode, and the default directory opened in VS Code changes accordingly.
{
"containerEnv": {
"WORKSPACE": "/home/vscode"
}
}
You can set the environment variables that can be used in the container by specifying the containerEnv item.
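Inside the container, a variable set via containerEnv behaves like any other environment variable. A minimal shell sketch (the fallback value is only there so it also runs outside the container):

```shell
# WORKSPACE is the variable defined in containerEnv above; fall back to
# a default so this also works outside the container.
WORKSPACE="${WORKSPACE:-/home/vscode}"
echo "$WORKSPACE"
```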
{
"runArgs": [
"--name=project_dev_container"
]
}
You can also pass options for launching the container directly through the runArgs item.
Under the hood, VS Code reads this devcontainer.json, adds various options of its own, and launches the container with docker run.
At that point, the list of strings specified in runArgs is appended to the command, separated by spaces.
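As a rough illustration (not the exact command the extension generates, and the image name is a placeholder), the runArgs strings end up spliced into the docker run invocation like this:

```shell
# Sketch only: VS Code's real command adds many more generated options.
run_args="--name=project_dev_container"
image="vsc-project-image"   # placeholder image name
echo docker run $run_args "$image"
```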
For more information, see "Developing inside a Container - devcontainer.json reference" in the official documentation.
git
If the .git/ directory is included in the project directory bound to the container, you can use VS Code's version control features inside the container as-is.
VS Code's version control is very convenient: you can git add, git commit, git push, and so on from the GUI.
However, every time you try to communicate with a remote repository from inside the container, you would normally need to authenticate with GitHub.
Fortunately, VS Code can share the host machine's Git credentials with the container.
Developing inside a Container - Sharing Git credentials with your container
First, save your GitHub user name and email address in the .gitconfig file on the host machine.
$ git config --global user.name "Your Name"
$ git config --global user.email "your.email@address"
These settings are written to .gitconfig in your home directory, and VS Code will automatically copy them into the container.
Next is authentication information such as passwords; there are two ways to set this up.
If you authenticate with an ID and password over HTTPS, save the password in git's credential helper and the settings will be synchronized with the container.
Caching your GitHub password in Git
$ git config --global credential.helper wincred
For SSH authentication, the settings are synchronized if you register your GitHub public key with the SSH agent on the host machine.
In PowerShell, run:
ssh-add $HOME/.ssh/github_rsa
This registers the key with the SSH agent.
However, the SSH agent is often not running; in that case, run the following in PowerShell with administrator privileges:
Set-Service ssh-agent -StartupType Automatic
Start-Service ssh-agent
Get-Service ssh-agent
Alternatively, you can configure it from the GUI under Services > OpenSSH Authentication Agent > Properties.
For more information, see [Enabling ssh-agent on Windows 10 from Command Prompt, WSL, and Git Bash](https://qiita.com/q1701/items/3cdc2d7ef7a3539f351d#ssh-agent-%E3%81%AE%E6%9C%89%E5%8A%B9%E5%8C%96) and similar articles.
AWS Access Key
When you try to communicate with AWS S3 from inside the container, you run into access key problems.
The access key information lives in the .aws directory in the host machine's home directory, and we want it to be readable inside the container as well.
Unlike the git case, however, it does not appear to be loaded automatically.
Therefore, you need to copy it into the container once from the outside using docker cp.
docker cp $HOME/.aws {container name}:/home/vscode
This is where it is convenient for the container launched by Remote-Containers to have a fixed name.
In the runArgs item of devcontainer.json earlier,
{
"runArgs": [
"--name=project_dev_container"
]
}
It is a good idea to give the container a name like this.
In actual use, you can work with the development container in almost exactly the same way as you would normally develop with VS Code.
Also, when developing in Python it is normal to prepare a virtual environment, but by building the environment inside the container, that becomes unnecessary.
Each project uses a different container, so installing packages globally does not pollute your environment.
And since images are provided for each version of Python, you can pick one that matches your production environment.
(These are benefits of Docker itself rather than of Remote-Containers.)
You can open and switch between these development environments with a single click.
The extension also automatically forwards ports, so you can use a local server for front-end development without stress.
However, a Node.js app, for example, may start a local server on port 3000, and that port must be published to the local machine.
In that case, in devcontainer.json
{
"appPort": [ 3000 ]
}
Let's set it like this.
The official documentation is long, though.
After seeing this article, I started using Remote-Containers. Thanks!