I have also made this available as a Google Colab notebook:
https://colab.research.google.com/drive/1s6_o-nvMmHhAeBAOoRDgE0UNU6Fe3XrC
It has almost the same contents as this article. If you have trouble opening Colab, please see here:
[Explanation with image] Register an account with a free trial of Google Cloud Platform (GCP)
Install the Google Cloud SDK and initialize it.
Authenticate with your Google account so that you can operate GCP with the gcloud command.
$ gcloud auth login
Project IDs must be globally unique; you cannot use an ID that is already taken.
$ PROJECT_ID=anata-no-pj-id
$ PROJECT_NAME=anata-no-pj-name
$ gcloud projects create $PROJECT_ID \
--name $PROJECT_NAME
Next, set up a billing account. If you do not, a 403 error will occur when accessing the bucket.
If the following pop-up does not appear, a billing account has already been set, so you can skip this step.
Set the project as the target of gcloud command operations.
$ gcloud config set project $PROJECT_ID
$ gcloud config list
# [component_manager]
# disable_update_check = True
# [compute]
# gce_metadata_read_timeout_sec = 0
# [core]
# account = [email protected]
# project = anata-no-pj-id
#
# Your active configuration is: [default]
If the output looks like this, the configuration is correct.
$ REGION=us-central1
$ ZONE=us-central1-a
$ gcloud config set compute/region $REGION
$ gcloud config set compute/zone $ZONE
$ gcloud config set ml_engine/local_python $(which python3)
The regions where AI Platform online prediction is available are listed in the official documentation.
The last command sets the interpreter so that Python 3 is used for local training.
$ gcloud config list
# [component_manager]
# disable_update_check = True
# [compute]
# gce_metadata_read_timeout_sec = 0
# region = us-central1
# zone = us-central1-a
# [core]
# account = [email protected]
# project = anata-no-pj-id
# [ml_engine]
# local_python = /usr/bin/python3
#
# Your active configuration is: [default]
If the output looks like this, setup is complete. Next, clone the following repository:
https://github.com/komiyakomiyakomiya/titanic_prediction_on_gcp
$ git clone https://github.com/komiyakomiyakomiya/titanic_prediction_on_gcp.git
In the notebook:
import os
os.makedirs('./titanic_prediction_on_gcp/working/models/', exist_ok=True)
The trained model will be saved as ./titanic_prediction_on_gcp/working/models/model.pkl.
$ gcloud ai-platform local train \
--package-path titanic_prediction_on_gcp/working/ \
--module-name working.predict_xgb
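The actual training code is in the repository, but the shape of a module like `working/predict_xgb.py` is simple: train a model, then pickle it under the expected file name. The sketch below is a hypothetical simplification (`train_and_save` is not the repository's API; the real module trains an XGBoost model on the Titanic data):

```python
import os
import pickle

MODEL_DIR = './titanic_prediction_on_gcp/working/models/'

def train_and_save(model):
    """Pickle a trained model where the rest of this walkthrough expects it."""
    os.makedirs(MODEL_DIR, exist_ok=True)
    # For scikit-learn/XGBoost deployments, AI Platform requires the
    # pickled artifact to be named exactly model.pkl
    path = os.path.join(MODEL_DIR, 'model.pkl')
    with open(path, 'wb') as f:
        pickle.dump(model, f)
    return path
```

The important part is the file name: when the version resource is created later, AI Platform looks for `model.pkl` in the GCS directory you point it at.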
Create a bucket in GCS to upload the saved model.
$ BUCKET_NAME=anata-no-bkt-name
$ gsutil mb -l $REGION gs://$BUCKET_NAME
$ gsutil ls -la
# gs://anata-no-bkt-name/
$ gsutil cp ./titanic_prediction_on_gcp/working/models/model.pkl gs://$BUCKET_NAME/models/model.pkl
$ gsutil ls gs://$BUCKET_NAME/models/
# gs://anata-no-bkt-name/models/model.pkl
Enable the following two APIs in order to use AI Platform.
$ gcloud services enable ml.googleapis.com
$ gcloud services enable compute.googleapis.com
$ gcloud services list --enabled
# NAME TITLE
# bigquery.googleapis.com BigQuery API
# bigquerystorage.googleapis.com BigQuery Storage API
# cloudapis.googleapis.com Google Cloud APIs
# clouddebugger.googleapis.com Stackdriver Debugger API
# cloudtrace.googleapis.com Stackdriver Trace API
# compute.googleapis.com Compute Engine API
# datastore.googleapis.com Cloud Datastore API
# logging.googleapis.com Stackdriver Logging API
# ml.googleapis.com AI Platform Training & Prediction API
# monitoring.googleapis.com Stackdriver Monitoring API
# oslogin.googleapis.com Cloud OS Login API
# servicemanagement.googleapis.com Service Management API
# serviceusage.googleapis.com Service Usage API
# sql-component.googleapis.com Cloud SQL
# storage-api.googleapis.com Google Cloud Storage JSON API
# storage-component.googleapis.com Cloud Storage
If these two appear in the list, you are good to go.
Create a model resource and a version resource, and associate them with the uploaded model.pkl.
Model resource
$ MODEL_NAME=model_xgb
$ MODEL_VERSION=v1
$ gcloud ai-platform models create $MODEL_NAME \
--regions $REGION
Version resource
$ gcloud ai-platform versions create $MODEL_VERSION \
--model $MODEL_NAME \
--origin gs://$BUCKET_NAME/models/ \
--runtime-version 1.14 \
--framework xgboost \
--python-version 3.5
Let's make a prediction using the data prepared in advance. First, check the contents.
$ cat titanic_prediction_on_gcp/input/titanic/predict.json
# [36.0, 0]  <- 36 years old, male
The format is [age, sex], where sex is encoded as male: 0, female: 1.
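As a sanity check, you can build such an instance line yourself. `make_instance` below is a hypothetical helper, but the encoding (male: 0, female: 1) and the one-JSON-array-per-line format follow the article:

```python
import json

def make_instance(age, sex):
    # Encoding from the article: male -> 0, female -> 1
    return [float(age), {'male': 0, 'female': 1}[sex]]

# --json-instances expects one JSON array per line
line = json.dumps(make_instance(36, 'male'))
print(line)  # -> [36.0, 0]
```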
$ gcloud ai-platform predict \
--model $MODEL_NAME \
--version $MODEL_VERSION \
--json-instances titanic_prediction_on_gcp/input/titanic/predict.json
# [0.44441232085227966]
If a predicted value like this is returned, everything is working.
Next, access AI Platform from Python and get a prediction. You will need a service account key, so first create a service account.
$ SA_NAME=anata-no-sa-name
$ SA_DISPLAY_NAME=anata-no-sa-display-name
$ gcloud iam service-accounts create $SA_NAME \
--display-name $SA_DISPLAY_NAME
$ gcloud projects add-iam-policy-binding $PROJECT_ID \
--member serviceAccount:[email protected] \
--role roles/iam.serviceAccountKeyAdmin
$ gcloud projects add-iam-policy-binding $PROJECT_ID \
--member serviceAccount:[email protected] \
--role roles/ml.admin
$ gcloud iam service-accounts keys create titanic_prediction_on_gcp/service_account_keys/key.json \
--iam-account [email protected]
Generate a .env file and write the environment variable pointing at the key path:
$ echo GOOGLE_APPLICATION_CREDENTIALS=/content/titanic_prediction_on_gcp/service_account_keys/key.json > /content/titanic_prediction_on_gcp/.env
$ cat ./titanic_prediction_on_gcp/.env
# GOOGLE_APPLICATION_CREDENTIALS=/content/titanic_prediction_on_gcp/service_account_keys/key.json
$ pip install python-dotenv
In the notebook:
import googleapiclient.discovery
from dotenv import load_dotenv

# Set the environment variables (GOOGLE_APPLICATION_CREDENTIALS)
load_dotenv('/content/titanic_prediction_on_gcp/.env')

def main(input_data):
    input_data = [input_data]
    PROJECT_ID = 'anata-no-pj-id'
    VERSION_NAME = 'v1'
    MODEL_NAME = 'model_xgb'
    service = googleapiclient.discovery.build('ml', 'v1')
    name = 'projects/{}/models/{}'.format(PROJECT_ID, MODEL_NAME)
    name += '/versions/{}'.format(VERSION_NAME)
    response = service.projects().predict(
        name=name,
        body={'instances': input_data}
    ).execute()
    if 'error' in response:
        print(response['error'])
    else:
        pred = response['predictions'][0]
        return pred
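The resource name passed to `predict()` follows a fixed pattern. A small helper makes it explicit (`model_version_name` is a hypothetical name; the format strings are the same ones used in `main` above):

```python
def model_version_name(project_id, model_name, version_name=None):
    """Build projects/<pj>/models/<model>[/versions/<version>]."""
    name = 'projects/{}/models/{}'.format(project_id, model_name)
    if version_name:
        # Omitting the version sends the request to the model's default version
        name += '/versions/{}'.format(version_name)
    return name

print(model_version_name('anata-no-pj-id', 'model_xgb', 'v1'))
# -> projects/anata-no-pj-id/models/model_xgb/versions/v1
```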
In the notebook:
import ipywidgets as widgets
from ipywidgets import HBox, VBox
age = [i for i in range(101)]
sex = ['male', 'female']
dropdown_age = widgets.Dropdown(options=age, description='age: ')
dropdown_sex = widgets.Dropdown(options=sex, description='sex: ')
variables = VBox(children=[dropdown_age, dropdown_sex])
VBox(children=[variables])
In the notebook:
import numpy as np
from IPython.display import Image
from IPython.display import display_png
input_age = float(dropdown_age.value)
input_sex = 0 if dropdown_sex.value == 'male' else 1
test_input = [input_age, input_sex]
pred = main(test_input)
# print(pred)
pred_binary = np.where(pred > 0.5, 1, 0)
# print(pred_binary)
print('\nIf you were aboard the Titanic...')
if pred_binary == 1:
    display_png(Image('/content/titanic_prediction_on_gcp/images/alive.png'))
else:
    display_png(Image('/content/titanic_prediction_on_gcp/images/dead.png'))
https://cloud.google.com/sdk/gcloud/reference/
https://cloud.google.com/sdk/gcloud/reference/ai-platform/
https://cloud.google.com/storage/docs/gsutil
Thank you for reading to the end. As a 36-year-old man, my own prediction says I would not make it, so I will be very careful if I ever board the Titanic.