While verifying something in an AWS environment, you may want to export a data file from EC2 to another environment. (This happens to me quite often; how about everyone else?) If the EC2 instance runs a Windows Server OS, you can use a browser, so there are various options such as uploading the file to Box or attaching it to webmail. But what about a Linux OS? (´・ω・`) So this time I would like to verify a method of uploading a file directly from EC2 to S3 with the AWS CLI and then downloading it to another environment from the S3 console screen.
I want to export a data file from EC2 (Linux) to S3 so that I can download it from the S3 console screen.
First, deploy EC2 and log in over SSH. Then create a directory and a test file to copy to S3.
# Switch user to root
sudo su root
# Apply package updates
yum -y update
# Create a directory (/home/test-share)
cd /home
mkdir test-share
cd test-share
# Create a test file (test.txt)
touch test.txt
# Check that the file was created
ls
Now you are ready to copy the test file to S3.
I have deployed EC2, but as it stands the instance does not have the permissions (an IAM role) to access S3, so I need to attach a role.
How to create a role: access IAM from the AWS console screen and click "Roles".
Then click "Create Role"
Select "EC2" for common use cases
Select a policy to assign to the role. Enter "S3" in the policy filter, and from the displayed policies select [AmazonS3FullAccess], since this time I want to access S3 for uploads, downloads, and browsing.
Add tags as needed (no tags are set this time).
Set an arbitrary role name and click "Create role".
Once you have created the role, attach it to EC2. Select the target instance on the EC2 console screen, and choose "Actions" - "Instance Settings" - "Attach/Replace IAM Role".
Confirm the target instance ID, assign the role, and click "Apply".
EC2 now has a role attached that allows it to access S3.
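For reference, the same role setup can also be done with the AWS CLI (run from somewhere that already has IAM permissions, not from the instance itself). This is a minimal sketch; the role name test-s3-role and the instance ID below are placeholders, so substitute your own values.

# Trust policy that lets EC2 assume the role
cat > trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "ec2.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}
EOF

# Create the role and attach the AmazonS3FullAccess managed policy
aws iam create-role --role-name test-s3-role --assume-role-policy-document file://trust-policy.json
aws iam attach-role-policy --role-name test-s3-role --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess

# EC2 attaches roles through an instance profile
aws iam create-instance-profile --instance-profile-name test-s3-role
aws iam add-role-to-instance-profile --instance-profile-name test-s3-role --role-name test-s3-role
aws ec2 associate-iam-instance-profile --instance-id i-0123456789abcdef0 --iam-instance-profile Name=test-s3-role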
Next, create an S3 bucket (test-20200624-folder) to copy the data to. Access the S3 console screen and click "Create bucket".
Enter "test-20200624-folder" and the region (this time Asia Pacific) in the bucket name, and click "Next".
Next come the bucket options, but this time there is nothing in particular to change, so click "Next" as is.
Make sure "Block all public access" is checked and click "Next".
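The bucket can also be created from the AWS CLI. A minimal sketch, assuming the Tokyo region (ap-northeast-1); adjust the region to match your environment:

# Create the bucket
aws s3 mb s3://test-20200624-folder --region ap-northeast-1
# Keep all public access blocked, matching the console setting above
aws s3api put-public-access-block --bucket test-20200624-folder --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true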
Once you have attached the role and created the bucket, enter the command to copy the data to S3 with the AWS CLI.
# Log in to EC2 again with SSH and switch to root
sudo su root
# Move to the directory where the test file is
cd /home/test-share
# Copy the file to S3 with the AWS CLI
aws s3 cp test.txt s3://test-20200624-folder
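Before opening the console, you can also confirm the upload from the CLI:

# List the contents of the bucket; test.txt should appear
aws s3 ls s3://test-20200624-folder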
Finally, check whether "test.txt" can be downloaded from the S3 console screen. Access the S3 console screen and look at the target bucket.
I was able to confirm that test.txt was copied! After that, select "Download" from "Actions", and the file export to another environment is complete.
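Incidentally, if the other environment also has the AWS CLI set up with credentials that can read the bucket, you can skip the console and download directly. A minimal sketch:

# Download the file from the bucket to the current directory
aws s3 cp s3://test-20200624-folder/test.txt ./test.txt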
There are articles with the same content in various places, but none of them explained attaching the role to EC2, and when I followed their steps an error occurred, so I hope this is useful as a memo and for beginners.
With the AWS CLI you can not only copy data from EC2 to S3, but also synchronize a specific directory on EC2 with a folder in the S3 bucket, as sketched below. That would be convenient if, say, EC2 served as a root CA and you wanted to export the issued certificates, or if you wanted to send log files out for verification. (`・ω・´)
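A minimal sketch of such a sync, reusing the directory and bucket from this article (the test-share/ prefix is just an example):

# Mirror /home/test-share into the bucket under the test-share/ prefix
aws s3 sync /home/test-share s3://test-20200624-folder/test-share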