I often see Qiita articles that upload local files to S3 with boto2, but I couldn't find one using boto3, so here is an implementation based on the official documentation.
Environment: Python 3.6.1, pip 9.0.1
- Package installation
- Config file settings
- Implementation (Python source code)
Install boto3 (the AWS SDK for Python):

```
$ pip install boto3
```
Also install awscli, the package for operating AWS services from the command line:

```
$ pip install awscli
```
Write your access key, secret key, region, and so on to the config files. Run the following command and enter the values interactively; the files will be created under your home directory.
__Note: in boto2 the AWS access key and secret key were read in the source code, but in boto3 these two keys are read from the config files.__
```
$ aws configure
AWS Access Key ID [None]: xxxxxxxxxxxxxxxxxxxxxxx
AWS Secret Access Key [None]: xxxxxxxxxxxxxxxxxxx
Default region name [None]: xxxxxxxxxxxxxxxxxxxxx
Default output format [None]: xxxxxxxxxxxxxxxxxxx
```
Two files are generated:
`~/.aws/credentials`

```
[default]
aws_access_key_id = ACCESS_KEY_ID
aws_secret_access_key = SECRET_ACCESS_KEY
```
`~/.aws/config`

```
[default]
region = [xxxxxxxxxxxxxxxxx]
output = [xxxxxxxxxxxxxxxxx]
```
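Both files are plain INI format, so you can sanity-check their contents from Python with the standard `configparser` module. A quick sketch (the values below are the same placeholders as above, not real keys):

```python
import configparser

# Placeholder contents mirroring ~/.aws/credentials (not real keys)
sample = """
[default]
aws_access_key_id = ACCESS_KEY_ID
aws_secret_access_key = SECRET_ACCESS_KEY
"""

parser = configparser.ConfigParser()
parser.read_string(sample)

# The [default] profile should expose both keys
print(parser['default']['aws_access_key_id'])
print(parser['default']['aws_secret_access_key'])
```

To check the real file, replace `read_string(sample)` with `parser.read(os.path.expanduser('~/.aws/credentials'))`.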
These files live in your home directory, and boto3 also reads them from there at run time, so note that moving the .aws directory causes an error:

```
botocore.exceptions.NoCredentialsError: Unable to locate credentials
```

Well, most people wouldn't move it anyway.
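If you do need to keep the files somewhere other than the home directory, botocore honors the `AWS_SHARED_CREDENTIALS_FILE` and `AWS_CONFIG_FILE` environment variables, so you can point boto3 at the new location instead of hitting the error above. A minimal sketch (the paths are hypothetical):

```python
import os

# Hypothetical new location for the moved .aws files
os.environ['AWS_SHARED_CREDENTIALS_FILE'] = '/new/location/credentials'
os.environ['AWS_CONFIG_FILE'] = '/new/location/config'

# boto3/botocore created after this point will look in these
# paths instead of ~/.aws/credentials and ~/.aws/config
```

Set the variables before creating any boto3 session or resource.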
```python
# -*- coding: utf-8 -*-
import os
import sys
import threading

import boto3

# boto3 reads credentials from ~/.aws/credentials and ~/.aws/config
BUCKET_NAME = 'YOUR_BUCKET_NAME'


class ProgressCheck(object):
    """Prints upload progress; called by upload_file with each chunk's byte count."""
    def __init__(self, filename):
        self._filename = filename
        self._size = int(os.path.getsize(filename))
        self._seen_so_far = 0
        self._lock = threading.Lock()

    def __call__(self, bytes_amount):
        with self._lock:
            self._seen_so_far += bytes_amount
            percentage = (self._seen_so_far / self._size) * 100
            sys.stdout.write(
                "\r%s / %s (%.2f%%)" % (
                    self._seen_so_far, self._size,
                    percentage))
            sys.stdout.flush()


def UploadToS3():
    # S3 connection
    s3 = boto3.resource('s3')
    # Pass ProgressCheck as the transfer Callback so progress is printed
    s3.Object(BUCKET_NAME, 'OBJECT_KEY (S3)').upload_file(
        'UPLOAD_FILE_PATH (LOCAL)',
        Callback=ProgressCheck('UPLOAD_FILE_PATH (LOCAL)'))

UploadToS3()
```
__OBJECT_KEY (S3)__: the object key to store the file under in S3.
__UPLOAD_FILE_PATH (LOCAL)__: the path of the local file to upload.
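`upload_file` invokes the `Callback` with the number of bytes transferred in each chunk, which is why `ProgressCheck` accumulates under a lock. You can exercise that accounting locally without touching S3. A standalone sketch with a made-up chunking loop (the class and sizes here are illustrative, not part of boto3):

```python
import threading


class Progress(object):
    """Same accounting as ProgressCheck, but with a fixed total size."""
    def __init__(self, size):
        self._size = size
        self._seen_so_far = 0
        self._lock = threading.Lock()

    def __call__(self, bytes_amount):
        with self._lock:
            self._seen_so_far += bytes_amount
            percentage = (self._seen_so_far / self._size) * 100
            print("\r%s / %s (%.2f%%)" % (
                self._seen_so_far, self._size, percentage), end="")


# Simulate an upload of 1000 bytes arriving in 250-byte chunks
progress = Progress(1000)
for _ in range(4):
    progress(250)
print()
```

After the four calls the counter reaches the full size, so the last line printed reads 100.00%.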
If I feel like it, I would like to write a comparison with the upload method in boto2. Thank you for reading to the end.
Please point out any mistakes or anything that needs updating.