When accessing Google Cloud Storage from boto, you can authenticate with an access key and secret key by enabling S3-compatible interoperability access. http://qiita.com/itkr/items/d990e87a2540332ee0e5
However, this method has one problem: the access key issued for interoperability access is tied to an individual Google user account, which is inconvenient for projects with several members. In that case, the alternative is to authenticate with a service account and a key file (p12).
Here we use a library called gcs-oauth2-boto-plugin.
pip install gcs-oauth2-boto-plugin
Prepare a configuration file that describes the service account and the path to the key file.
By default ~/.boto is read, but you can point to an arbitrary path by setting one of the following environment variables. Incidentally, this is the same behavior as gsutil.

- BOTO_CONFIG (specify a single file)
- BOTO_PATH (specify multiple files separated by `:`)
To set the environment variable from Python, write it as follows.
```python
import os
os.environ['BOTO_CONFIG'] = '/path.to/boto_config'
```
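If you want to layer several config files with BOTO_PATH instead, it could look like this (the paths below are just placeholders):

```python
import os

# Multiple config files separated by ':', read in order (placeholder paths)
os.environ['BOTO_PATH'] = '/etc/boto.cfg:/path.to/boto_config'
```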
All sorts of settings can go in this file, but the ones that matter here are the following:
```
[Credentials]
gs_service_key_file = /path.to/sample-KEYFILE.p12
gs_service_client_id = [email protected]

[GSUtil]
default_project_id = sampleproject-994
```
Reference: https://cloud.google.com/storage/docs/gsutil/commands/config#additional-configuration-controllable-features
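As a quick way to check that boto actually picked the file up, you can read the values back through boto.config (a minimal sketch; the section and option names are the ones shown above):

```python
import os
os.environ['BOTO_CONFIG'] = '/path.to/boto_config'  # must be set before boto is imported

import boto

# boto.config is a ConfigParser-like object built from the boto config file(s)
print boto.config.get('Credentials', 'gs_service_client_id')
print boto.config.get('GSUtil', 'default_project_id')
```

With the config in place, connecting is as simple as: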
```python
import boto

bucket_name = 'bucket_name'
uri = boto.storage_uri(bucket_name, 'gs')
print uri.get_bucket()
```
Basically, this is all you need if you have gcs-oauth2-boto-plugin installed.
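Once that works, the rest is the ordinary boto bucket API. For example, a minimal sketch that lists the objects in the bucket (the bucket name is a placeholder; depending on the environment you may also need to import the plugin module explicitly so that its AuthHandler subclass gets loaded):

```python
import boto
import gcs_oauth2_boto_plugin  # makes sure the plugin's AuthHandler subclass is registered

bucket = boto.storage_uri('bucket_name', 'gs').get_bucket()
for key in bucket.list():
    print key.name, key.size
```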
boto provides a mechanism for plugging in auth plugins when an AuthConnection is established. Explaining it in detail would mean reproducing most of boto's implementation, so I will skip that, but if you look through boto's auth-related code you will find a call to the function get_plugin.
boto/plugin.py
```python
def get_plugin(cls, requested_capability=None):
    if not requested_capability:
        requested_capability = []
    result = []
    for handler in cls.__subclasses__():
        if handler.is_capable(requested_capability):
            result.append(handler)
    return result
```
`__subclasses__` returns the list of subclasses, which here means the list of classes that inherit from the `AuthHandler` class. Since the `OAuth2ServiceAccountAuth` class in gcs-oauth2-boto-plugin inherits from `AuthHandler`, it is recognized as a plugin. Inside it, the necessary information such as `gs_service_key_file` is read from the configuration file.
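The registration mechanism itself is nothing more than Python's `__subclasses__`; here is a toy sketch independent of boto (the class names are made up):

```python
class Handler(object):
    @classmethod
    def plugins(cls):
        # every class that inherits from Handler is picked up automatically
        return cls.__subclasses__()


class SpamHandler(Handler):
    pass


class EggHandler(Handler):
    pass


print Handler.plugins()
# [<class '__main__.SpamHandler'>, <class '__main__.EggHandler'>]
```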
You can also implement your own class that inherits from `AuthHandler` and forcibly overwrite the config file path inside it, so that the configuration can be switched dynamically. There is probably a smarter way to do this, though.
```python
import boto
from boto.auth_handler import AuthHandler
from boto.pyami.config import Config

boto_path = ''


class SpamAuth(AuthHandler):
    def __init__(self, path, config, provider):
        # ignore the config boto passed in and rebuild it from boto_path
        config = Config(path=boto_path)
        # ...omitted...


def spam(path='/path.to/boto_config'):
    global boto_path
    boto_path = path
    bucket_name = 'bucket_name'
    uri = boto.storage_uri(bucket_name, 'gs')
    print uri.get_bucket()
```
At this point, the URI class (`BucketStorageUri` in this case) caches the connection in a dict called `provider_pool` once a connection has been established, so when switching connections it seems you have to clear it with `del BucketStorageUri.provider_pool['gs']` or `BucketStorageUri.provider_pool = {}`.
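Putting the two together, a switching helper might look like the following sketch (it relies on the caching behavior described above; spam() and boto_path come from the earlier example):

```python
from boto.storage_uri import BucketStorageUri


def switch_config(path):
    # Drop the cached 'gs' connection so the next storage_uri call
    # re-authenticates against the newly specified config file
    BucketStorageUri.provider_pool.pop('gs', None)
    spam(path)
```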