The end users want to upload files from their on-premises environment to a GCS bucket; that was about all the information I had, and it was hard to ask them for details directly. The GCP project is built without an organization, and I tried to think the problem through without relying on factors such as whether they use Google Workspace, whether they have Google accounts, or what their on-premises environment looks like.
I decided to create a service account in my GCP project, grant it the Storage Object Creator role, issue a service account key, and have the files uploaded from a Docker container running in the end user's on-premises environment.
Create a bucket with storage class STANDARD, in the Tokyo region (asia-northeast1), with uniform bucket-level access.
export PROJECT_ID=
export BUCKET_ID=
export DEV_BUCKET_ID=$BUCKET_ID-dev
# create bucket
gsutil mb -p $PROJECT_ID -c STANDARD -l ASIA-NORTHEAST1 -b on gs://$DEV_BUCKET_ID
# check bucket's iam
gsutil iam get gs://$DEV_BUCKET_ID
Buckets that handle sensitive information may inherit permissions set at the project level, so check the permissions on the Console (the gsutil iam get command cannot show permissions inherited from the project level). By default, the basic roles (Owner, Editor, Viewer) can also access the bucket; remove them if you don't need them. Be aware, though, that if you remove the Owner and no member has a role equivalent to Storage Admin, nobody will be able to operate the bucket.
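For reference, you can also list the project-level bindings that the bucket inherits with gcloud (a quick sketch, assuming you are authenticated with permission to read the project's IAM policy):
# check project-level IAM bindings inherited by the bucket
gcloud projects get-iam-policy $PROJECT_ID \
--flatten="bindings[].members" \
--format="table(bindings.role, bindings.members)"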
Create a service account and a key for use in the on-premises environment, and grant the service account the Storage Object Creator role at the bucket level.
export PROJECT_ID=
export SER_ACC_ID=
export DEV_SER_ACC_ID=$SER_ACC_ID-dev
# create service account
gcloud iam service-accounts create $DEV_SER_ACC_ID --project $PROJECT_ID \
--description "" --display-name $DEV_SER_ACC_ID
# create key file for service account
gcloud iam service-accounts keys create {KEY_FILE_PATH} \
--project $PROJECT_ID --iam-account $DEV_SER_ACC_ID@$PROJECT_ID.iam.gserviceaccount.com
# grant objectCreator role to service account
gsutil iam ch serviceAccount:$DEV_SER_ACC_ID@$PROJECT_ID.iam.gserviceaccount.com:objectCreator \
gs://$DEV_BUCKET_ID
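To double-check the setup, you can list the keys issued for the service account and look at the bucket IAM again (same shell variables as above; this verification step is my own addition):
# list keys issued for the service account
gcloud iam service-accounts keys list \
--project $PROJECT_ID --iam-account $DEV_SER_ACC_ID@$PROJECT_ID.iam.gserviceaccount.com
# confirm the objectCreator binding at the bucket level
gsutil iam get gs://$DEV_BUCKET_ID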
Once you have done Install Google Cloud SDK you can use gsutil directly, but I deliberately use Google Cloud SDK Docker instead. First, the file layout is as follows.
├── Dockerfile
├── certs
│ └── service-account-keyfile.json
├── csv
│ └── testdata.csv
└── docker-entrypoint.sh
Dockerfile
FROM google/cloud-sdk:slim
COPY ./docker-entrypoint.sh /
ENTRYPOINT ["/docker-entrypoint.sh"]
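Note that COPY keeps the file mode from the host, so if docker-entrypoint.sh is not executable there the container will fail to start. Making it executable before building is my own precaution, not shown in the original layout:
# make the entrypoint executable before building the image
chmod +x docker-entrypoint.sh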
docker-entrypoint.sh
#!/bin/bash
set -e
usage() {
echo "Usage: $0 [-f upload csv file path] [-b gcs bucket name]" 1>&2
exit 1
}
while getopts f:b:h OPT
do
case $OPT in
f) FILE_PATH=$OPTARG
;;
b) BUCKET_NAME=$OPTARG
;;
h) usage
;;
esac
done
# both options are required
if [ -z "$FILE_PATH" ] || [ -z "$BUCKET_NAME" ]; then
usage
fi
if [ ! -f "$FILE_PATH" ]; then
echo "$FILE_PATH does not exist." 1>&2
exit 1
fi
# service account login
gcloud auth activate-service-account --key-file=/certs/service-account-keyfile.json
# upload file to GCS
gsutil cp "$FILE_PATH" "gs://$BUCKET_NAME"
You can upload the file to the GCS bucket by running the commands below.
export BUCKET_ID=
export DEV_BUCKET_ID=$BUCKET_ID-dev
# create docker image
docker build -t uploader-gcloud .
# upload testdata.csv
docker run --rm \
--mount type=bind,source=$PWD/certs,target=/certs \
--mount type=bind,source=$PWD/csv,target=/csv \
uploader-gcloud -f /csv/testdata.csv -b $DEV_BUCKET_ID
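To confirm the upload succeeded, you can list the objects in the bucket. Note that Storage Object Creator alone does not include storage.objects.list, so run this check as a user or service account with a role such as Storage Object Viewer (this check is my addition, not part of the end-user flow):
# verify the uploaded object (requires storage.objects.list)
gsutil ls gs://$DEV_BUCKET_ID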
In the first place, asking end users to run Docker in their environment may be too high a hurdle. Still, I figured I would never know without trying. In the end, the end user created a Google account, I granted it the Storage Object Creator role plus the storage.buckets.list and storage.objects.list permissions, and they now upload the files manually from the Console every day. (So much for saving labor.)
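For the record, one way to grant that combination is a small custom role alongside Storage Object Creator; the role ID and the email address below are placeholders, not what was actually used:
# create a custom role containing only the two list permissions
gcloud iam roles create bucketObjectLister --project $PROJECT_ID \
--title "Bucket Object Lister" \
--permissions storage.buckets.list,storage.objects.list
# grant the custom role to the end user's Google account at the project level
gcloud projects add-iam-policy-binding $PROJECT_ID \
--member user:end-user@example.com \
--role projects/$PROJECT_ID/roles/bucketObjectLister
# grant Storage Object Creator on the bucket
gsutil iam ch user:end-user@example.com:objectCreator gs://$DEV_BUCKET_ID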
Even so, as a Docker beginner, I found Google Cloud SDK Docker to be a very handy tool for checking that permissions work as intended.