Use of Google Cloud Storage (GCS) with "GAE / Py"

This article assumes that files are uploaded from the client to an application running on "GAE / Python". It does not cover reading or writing data files created on the server side (data backups, etc.); the goal is simply to let clients attach and download files.

These notes are written at the level of a reminder for later, so detailed explanations are omitted. There may also be misunderstandings, so corrections are welcome.

[Reference] Sinmetal's article

Overview of operation

upload

  1. The client calls the URL acquisition API used for uploading (the API that creates the upload URL)
  2. The URL is returned to the client in the response to step 1
  3. The client POSTs a request containing the file to the URL received in step 2 (see the client-side sketch after this list)
  4. When the upload is completed, GCS (Google Cloud Storage: hereafter GCS) notifies GAE with a request (POST) containing information about the uploaded file
  5. If necessary, create a model for the additional information and register the uploaded file information in the Datastore
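
As a rough illustration, the client side of steps 1-3 might look like the sketch below. The host name, endpoint path, and response field name are placeholders; only the form field name 'file' matches the upload handler implemented later.

# Client-side sketch of the upload flow (steps 1-3). The host, path and
# response field are placeholders; the form field name 'file' must match
# what the upload handler's get_uploads('file') expects.
import requests

# 1. Ask the server for a one-time upload URL
resp = requests.get('https://your-app.appspot.com/get_upload_url')
upload_url = resp.json()['url']

# 3. POST the file to that URL as multipart/form-data
with open('sample.jpg', 'rb') as f:
    requests.post(upload_url, files={'file': f})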

download

  1. Receive a download request from the client
  2. Return the file in the response

What to implement

  1. Upload URL acquisition API (get the POST destination URL for uploading a file)
  2. Request handler after upload (get and register information about the uploaded file)
  3. Download handler
  4. Linking to the handler

1. Upload URL acquisition API

To issue a URL for uploading to Google Cloud Storage, use "create_upload_url" just as you would with the blobstore, but by passing the optional "gs_bucket_name=" argument you can specify a bucket created on GCS.

The following implementation uses the endpoints format (the API is intended for an iOS client). Only the parts that use GCS are shown; the endpoints-specific details are omitted.

ep_upload.py


# GetUploadUrlResponse is a response class specific to this endpoints API
import GetUploadUrlResponse

from google.appengine.ext import blobstore
・
・
class GetUploadUrl():
    def __init__(self, request):
        self.response = None

    def done(self):
        # In create_upload_url, specify the GCS bucket name with "gs_bucket_name"
        url = blobstore.create_upload_url('/uploaded',
                                          gs_bucket_name='bucket_name/path/')
        self.response = GetUploadUrlResponse(ret=0, url=url)
        return self.response
・
・
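
If you are not using endpoints, the same URL can be issued from an ordinary webapp handler. A minimal sketch follows; the handler class name and route are placeholders, and the bucket path is the same example value as above.

# Non-endpoints sketch: issue the upload URL from a plain webapp handler.
# The class name and route are placeholders.
from google.appengine.ext import webapp
from google.appengine.ext import blobstore

class GetUploadUrlHandler(webapp.RequestHandler):
    def get(self):
        # gs_bucket_name switches the upload destination from the blobstore to GCS
        url = blobstore.create_upload_url('/uploaded',
                                          gs_bucket_name='bucket_name/path/')
        self.response.write(url)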

2. Request handler after upload

When the POST of the file to the issued URL finishes and the upload to GCS is complete, a request containing the file information is sent to the path specified in "create_upload_url". Receive that notification and register the information in the Datastore using a model that manages uploaded files.
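
The GsFile model itself is not shown in this article. A minimal sketch, assuming ndb and a create() classmethod matching the call in the handler below (the actual model may well differ), could look like this:

# models.py -- minimal sketch of the management model (an assumption, not the
# original). It only holds the fields used by the upload handler.
from google.appengine.ext import ndb

class GsFile(ndb.Model):
    filename = ndb.StringProperty()
    content_type = ndb.StringProperty()
    size = ndb.IntegerProperty()
    key_str = ndb.StringProperty()  # BlobKey as a string, used for downloads

    @classmethod
    def create(cls, filename, content_type, size, key):
        entity = cls(filename=filename, content_type=content_type,
                     size=size, key_str=key)
        entity.put()
        return entity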

There are two methods, inherited from blobstore_handlers.BlobstoreUploadHandler, for getting the uploaded file from the request.

Both provide the same information about the uploaded file, but note that the types they return are completely different.

  When using get_uploads: BlobInfo
  When using get_file_infos: FileInfo

In my implementation (or perhaps my environment), downloading fails with an error if "get_file_infos" is used, so I used "get_uploads". (A sketch of the get_file_infos variant appears after the handler below, for comparison.)

upload.py


from google.appengine.ext import webapp
from google.appengine.ext import blobstore
from google.appengine.ext.webapp import blobstore_handlers
from models import GsFile
・
・
class UploadHandler(blobstore_handlers.BlobstoreUploadHandler, webapp.RequestHandler):
    def post(self):
        # get_uploads returns the BlobInfo objects for the form field named 'file'
        blob_infos = self.get_uploads('file')

        if not isinstance(blob_infos, list):
            blob_infos = [blob_infos]

        # Register each uploaded file in the Datastore via the GsFile model
        for blob_info in blob_infos:
            GsFile.create(filename=blob_info.filename,
                          content_type=blob_info.content_type,
                          size=blob_info.size,
                          key=str(blob_info.key()))

        self.response.write('0')
・
・
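
For reference, a sketch of the "get_file_infos" variant is shown below (the class name is a placeholder and this is for comparison only; as noted above, downloads failed with this approach in my environment, so the get_uploads version is what is actually used).

# Sketch of the get_file_infos variant. A FileInfo has no BlobKey; it exposes
# the GCS object name instead, so the BlobInfo-based download handler below
# would need a different approach.
class UploadHandlerFileInfo(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        file_infos = self.get_file_infos('file')
        for file_info in file_infos:
            GsFile.create(filename=file_info.filename,
                          content_type=file_info.content_type,
                          size=file_info.size,
                          key=blobstore.create_gs_key(file_info.gs_object_name))
        self.response.write('0')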

3. Download handler

The client looks up the management model (GsFile in this example) registered when the upload completed, and makes a download request using its key information.

upload.py


・
・
import urllib  # needed for urllib.unquote (import at the top of upload.py)

class ContentDownload(blobstore_handlers.BlobstoreDownloadHandler):
    def get(self, key):
        # The key arrives URL-encoded in the path, so unquote it before lookup
        key = str(urllib.unquote(key))
        blob_info = blobstore.BlobInfo.get(key)
        self.send_blob(blob_info)
・
・
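
From the client, the download is then just a GET to the download route with the key appended. A minimal sketch (the host name is a placeholder; the key is the string stored in GsFile, and the route is the one mapped in section 4):

# Client-side download sketch. The host is a placeholder; the key comes from
# the GsFile record created at upload time.
import urllib
import requests

key = '...'  # BlobKey string stored in GsFile
url = 'https://your-app.appspot.com/download/' + urllib.quote(key)
resp = requests.get(url)
with open('downloaded_file', 'wb') as f:
    f.write(resp.content)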

When downloading via a link in the browser, calling it as-is means the original file name is not available, so the file is saved under the name of the URL path. If you want to support direct downloads from the browser, set the "save_as" option of "send_blob".

self.send_blob(blob_info, save_as=True)
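
"save_as" also accepts an explicit file name string if you want to control the name the browser saves the file under (the name below is just a placeholder):

# Inside get() above: serve the blob under an explicit file name
# ('report.pdf' is a placeholder).
self.send_blob(blob_info, save_as='report.pdf')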

4. Linking to the handler

Map each request to its handler class (the download route accepts the key as part of the URL).

upload.py


・
・
APPLICATION = webapp.WSGIApplication([('/upload', UploadHandler),
                                      ('/download/([^/]+)?', ContentDownload)])

app.yaml


- url: /(upload|download/.*)
  script: upload.APPLICATION
