A script that uses boto to upload a specified folder to Amazon S3

* Postscript (2015-01-07)

I was told that if you use [awscli](https://pypi.python.org/pypi/awscli) instead of boto, you don't need to write an S3 bucket upload script like the one below. awscli looks convenient, so please use it instead.

Install awscli:

$ pip install awscli

Register your S3 credentials (configuration files are generated in ~/.aws/):

$ aws configure
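The prompts look roughly like this (the values here are placeholders, not real credentials, and the region is just an example):

$ aws configure
AWS Access Key ID [None]: AKIAXXXXXXXXXXXXXXXX
AWS Secret Access Key [None]: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
Default region name [None]: ap-northeast-1
Default output format [None]: json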

Upload the specified folder:

$ aws s3 sync <folder path> s3://<bucket name>

Confirm the upload:

$ aws s3 ls <bucket name>

You can also download:

$ aws s3 sync s3://<bucket name> <download destination path>
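As far as I know, sync also accepts a --delete flag if you want the destination to exactly mirror the source (files that no longer exist on the source side are removed):

$ aws s3 sync <folder path> s3://<bucket name> --delete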

A script to upload a specified folder to Amazon S3

This is a Qiita version of something I had posted as a Gist.

It uses boto to upload a specified folder (all the files inside it) to an Amazon S3 bucket.

Install boto with pip install boto or similar, and fill in ACCESS_KEY_ID, SECRET_ACCESS_KEY, and BUCKET_NAME. In the example, each file is uploaded with a 'Cache-Control: max-age=10' header; you can add other headers if you like.
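For instance, here is a minimal sketch of uploading a single file with an extra header, mirroring the script's style (the key name, file name, and Content-Type value are hypothetical illustrations, and the credentials are placeholders):

# upload one file with extra headers (hypothetical example)
from boto.s3.connection import S3Connection
from boto.s3.key import Key

s3 = S3Connection('xxx', 'xxx')  # placeholder credentials
bucket = s3.get_bucket('xxx')    # placeholder bucket name

k = Key(bucket)
k.key = 'index.html'  # hypothetical key name
k.set_metadata('Cache-Control', 'max-age=10')
# extra headers can also be passed at upload time
k.set_contents_from_filename('index.html', headers={'Content-Type': 'text/html'})
k.make_public()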

deploys3.py


# -*- coding: utf-8 -*-
"""deploy script to upload the files to AWS S3 bucket
 
Usage:
    $ python deploys3.py <folder to deploy>
"""
 
import os
import sys
from boto.s3.connection import S3Connection
from boto.s3.key import Key
 
 
ACCESS_KEY_ID = 'xxx'
SECRET_ACCESS_KEY = 'xxx'
BUCKET_NAME = 'xxx'


def main():
    # check arguments
    if len(sys.argv) != 2:
        print '[ERROR] wrong number of arguments. (required 1, got %s)' % (len(sys.argv) - 1)
        sys.exit(1)
    folder_name = sys.argv[1]

    # upload to S3
    try:
        upload_to_s3(folder_name)
    except Exception as e:
        print '[ERROR] upload to S3 failed: %s' % e
        sys.exit(1)
    print '[OK] upload to S3 bucket has completed successfully. :)'
 
 
def upload_to_s3(folder_name):
    # connect to S3
    s3 = S3Connection(ACCESS_KEY_ID, SECRET_ACCESS_KEY)
    bucket = s3.get_bucket(BUCKET_NAME)
    # upload each file with metadata and make it publicly readable
    file_count = 0
    for abspath, relpath in upload_files(folder_name):
        k = Key(bucket)
        k.key = relpath
        k.set_metadata('Cache-Control', 'max-age=10')
        k.set_contents_from_filename(abspath)
        k.make_public()
        file_count += 1
    print '[OK] %s files were uploaded.' % file_count
 
 
def upload_files(basedir):
    # walk the folder and yield (file path, S3 key relative to basedir)
    # for every file, skipping invisible files (names starting with '.')
    for (path, dirs, files) in os.walk(basedir):
        for fn in files:
            if fn.startswith('.'):
                continue
            abspath = os.path.join(path, fn)
            yield (abspath, os.path.relpath(abspath, basedir))
 
 
if __name__ == '__main__':
    main()

If you run this script as follows, every file in the hoge folder (except invisible files, i.e. those whose names start with '.') is uploaded to the root of the S3 bucket with the specified header and made public. For example, hoge/css/style.css is uploaded under the key css/style.css.

$ python deploys3.py hoge
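As a minimal sketch of how you might confirm the result from Python with boto (assuming the same placeholder credentials and bucket name as in deploys3.py):

# list the keys now in the bucket
from boto.s3.connection import S3Connection

s3 = S3Connection('xxx', 'xxx')  # placeholder credentials
bucket = s3.get_bucket('xxx')    # placeholder bucket name
for key in bucket.list():
    print key.name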

I'm still not sure how to make the best use of Qiita posts... If you know of a gem or an npm package with boto-like features, please let me know. m(_ _)m
