CodeCommit + CodePipeline with CDK

Introduction

This post introduces a mechanism for uploading files to S3 with CodeCommit + CodePipeline, which I prototyped while studying the AWS CDK. Within the engineering team there were voices saying "I want to try it," I was interested myself, and after hearing from engineers in another department that "the CDK is good," I gave it a try.

What is AWS CDK?

The AWS Cloud Development Kit (AWS CDK) is an open source software development framework for modeling and provisioning cloud application resources using familiar programming languages. Being able to define infrastructure in Python or TypeScript is convenient.

Background

The beginning

Originally, Cloud9 was used as the environment for producing the content. The selection requirements at the time were:

  1. Files can be edited simultaneously
  2. The deliverables can be previewed
  3. The preview is protected by Basic authentication
  4. Version control is available

So, using Cloud9 + CodeCommit + Lambda + CloudFront + Lambda@Edge + S3, I built a setup in which HTML and image files could be uploaded from Cloud9 to S3 via CodeCommit and previewed through CloudFront. At first it seemed to work well.

Problems

Cloud9, which initially seemed to work well, gradually started to cause problems:

  1. Some files were not uploaded to S3. (The Lambda that copies from CodeCommit to S3 is the likely suspect.)
  2. The number of projects kept increasing and Cloud9's disk space was getting tight (the projects are full of images...).
  3. There were signs that the simultaneous editing feature was not being used in the first place...

So there was no longer any real need for Cloud9. In that case, the momentum for a remake grew: it would be better to edit locally and upload to S3. (Since the members who actually write the code are not the ones who touch AWS, I also wanted to provide a setup that keeps them away from AWS as much as possible.)

Main subject

For the remake, we adopted the CDK that had a good reputation within the company. The CDK is in charge of creating the CodeCommit, CodePipeline, and CodeBuild resources on the left side of the figure below. (S3 + CloudFront + Lambda@Edge are the same as before.)

codecommit_diagram.png

In the Cloud9 era, the setup was 1 Cloud9 (1 EC2 instance) = multiple projects = 1 repository; this was changed to 1 project = 1 repository. Running cdk deploy at the start of a project creates the repository on CodeCommit and sets up CodePipeline, and every time you push to master, the files are uploaded to S3 by CodeBuild.
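
Once the pipeline exists, a project member's day-to-day flow looks roughly like this (a sketch; the repository name test_repository is just an example, and the clone URL follows the same pattern as in the make.sh script later in this post):


$ git clone ssh://git-codecommit.ap-northeast-1.amazonaws.com/v1/repos/test_repository
$ cd test_repository
# edit the html / image files, then
$ git add -A
$ git commit -m "update pages"
$ git push origin master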

Implementation

Start by following the documentation:

$ npm install -g aws-cdk
$ mkdir test
$ cd test
$ cdk init app --language python
$ source .env/bin/activate
$ pip install -r requirements.txt


app.py


#!/usr/bin/env python3
import os
from os.path import join, dirname
from dotenv import load_dotenv

from aws_cdk import core
from test.test_stack import TestStack


# Load the repository name from .env_file (written by make.sh, shown later)
dotenv_path = join(dirname(__file__), '.env_file')
load_dotenv(dotenv_path)


app = core.App()
TestStack(app,
          "test",
          repo_name=os.environ["REPOSITORY_NAME"],
          env={"account": "xxxxxxxxxxxx", "region": "ap-northeast-1"})
app.synth()
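
For reference, the .env_file that app.py loads is just a one-line dotenv file holding the repository name; it is written by the make.sh script shown later. A minimal example (the value is only an illustration):

.env_file


REPOSITORY_NAME=test_repository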

test_stack.py


from aws_cdk import core
from aws_cdk import aws_codecommit as codecommit
from aws_cdk import aws_codebuild as codebuild
from aws_cdk import aws_codepipeline as codepipeline
from aws_cdk import aws_codepipeline_actions as codepipeline_actions
from aws_cdk import aws_iam as iam

class TestStack(core.Stack):

    def __init__(self, scope: core.Construct, id: str, repo_name: str, **kwargs) -> None:
        super().__init__(scope, id, **kwargs)

        # Name of the destination S3 bucket
        s3_bucket_name = "test-bucket"

        # Create the CodeCommit repository
        repo = codecommit.Repository(self,
                                     "Repository",
                                     repository_name=repo_name,
                                     description="test.")
        repository = codecommit.Repository.from_repository_arn(self, repo_name, repo.repository_arn)

        # Define the CodePipeline
        pipeline = codepipeline.Pipeline(self,
                                         id=f"test-pipeline-{repo_name}",
                                         pipeline_name=f"test-pipeline-{repo_name}")
        source_output = codepipeline.Artifact('source_output')

        # Source stage: pull from the CodeCommit repository
        source_action =  codepipeline_actions.CodeCommitSourceAction(repository=repository,
                                                                     branch='master',
                                                                     action_name='source_collect_action_from_codecommit',
                                                                     output=source_output,
                                                                     trigger=codepipeline_actions.CodeCommitTrigger.EVENTS)
        pipeline.add_stage(stage_name='Source', actions=[source_action])

        # Build stage: CodeBuild syncs the sources to S3
        cdk_build = codebuild.PipelineProject(self,
                                              "CdkBuild",
                                              build_spec=codebuild.BuildSpec.from_object(dict(
                                                  version="0.2",
                                                  phases=dict(
                                                      build=dict(
                                                          commands=[f"aws s3 sync ./ s3://{s3_bucket_name}/"]
                                                      )
                                                  )
                                              )))
        cdk_build.add_to_role_policy(
            iam.PolicyStatement(
                resources=[f'arn:aws:s3:::{s3_bucket_name}', f'arn:aws:s3:::{s3_bucket_name}/*'],
                actions=['s3:*']
            )
        )

        build_output = codepipeline.Artifact("CdkBuildOutput")

        build_action = codepipeline_actions.CodeBuildAction(
                            action_name="CDK_Build",
                            project=cdk_build,
                            input=source_output,
                            outputs=[build_output])

        pipeline.add_stage(
            stage_name="Build",
            actions=[build_action]
        )
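
Before running the actual deploy, it is handy to check what will be created. cdk synth prints the generated CloudFormation template and cdk diff compares it against what is already deployed; these are standard CDK CLI commands, and the stack name and profile below simply mirror the ones used elsewhere in this post.


$ cdk synth test
$ cdk diff test --profile xxxxxxx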

Thinking about it now, since this CodeBuild step does nothing but an s3 sync, CodeDeploy might have been a better fit than CodeBuild. I also want to place the files one level deeper in the S3 bucket when doing the s3 sync, but I have not found a clean solution yet (one idea is sketched after the snippet below).

The CodeBuild part in question


# Build stage: CodeBuild syncs the sources to S3
cdk_build = codebuild.PipelineProject(self,
                                      "CdkBuild",
                                      build_spec=codebuild.BuildSpec.from_object(dict(
                                          version="0.2",
                                          phases=dict(
                                              build=dict(commands=[f"aws s3 sync ./ s3://{s3_bucket_name}/"])
                                          )
                                      )))
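
As for putting the files one level deeper on the S3 side, one idea (just a sketch, not something this stack currently does) is to add a key prefix to the sync destination, for example the repository name. s3_bucket_name and repo_name are the same variables as in test_stack.py, and the IAM policy added with add_to_role_policy already covers the whole bucket, so it would not need to change.


# Build stage variant: sync into a per-repository prefix instead of the bucket root
cdk_build = codebuild.PipelineProject(self,
                                      "CdkBuild",
                                      build_spec=codebuild.BuildSpec.from_object(dict(
                                          version="0.2",
                                          phases=dict(
                                              build=dict(commands=[f"aws s3 sync ./ s3://{s3_bucket_name}/{repo_name}/"])
                                          )
                                      )))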

Deploy

After that, running

$ cdk deploy

will deploy it, but I made the following script for the users.

make.sh


#!/bin/bash

if [ $# -ne 1 ]; then
  echo "It requires one argument to execute." 1>&2
  echo "Specify the project code in the argument. (Partially used for the URL at the time of preview)"
  exit 1
fi
# Write the repository name so that app.py can read it
echo REPOSITORY_NAME=$1 > .env_file
cdk deploy test --profile xxxxxxx
# Clone the new repository and push an initial commit to create the master branch
git clone ssh://git-codecommit.ap-northeast-1.amazonaws.com/v1/repos/$1 ../$1
mkdir ../$1/$1
touch ../$1/$1/.gitkeep
cd ../$1
git checkout -b master
git add -A
git commit -m "initial commit"
git push origin master

Now, run it with the name of the repository you want to create as the argument:

$ ./make.sh test_repository

and that's it.

Concerns

  1. Since the CodeCommit repository is created by the CDK, there is a risk that deleting the stack will delete the entire repository as well (a possible mitigation is sketched after this list).
  2. The master branch does not seem to exist on the CodeCommit side right after the first deploy, so the Build stage fails at first. Once you git push, it works properly, so I have left it as is for now. (It still feels a little uncomfortable.)
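
For the first concern, a possible mitigation (an assumption on my part that depends on the CDK version; it is not part of the stack above) is to mark the repository as retained so that deleting the stack leaves the repository behind:


# Keep the CodeCommit repository even if the stack is deleted
# (sketch; "repo" is the Repository construct created in test_stack.py)
repo.apply_removal_policy(core.RemovalPolicy.RETAIN)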

Finally

This time, I built a mechanism that uploads files from CodeCommit to S3 via CodeBuild using the CDK. It seems to have a better reputation than the Cloud9 era, so I'm hoping it can be used as is while waiting for feedback. The CDK itself is nice because you don't have to fight CloudFormation directly.
