In this post I will show how to use the BigQuery client library google-cloud-bigquery and its wrapper pandas-gbq.
Explanation with images: https://cloud.google.com/bigquery/docs/pandas-gbq-migration?hl=ja
Setup steps:
1. Register an account with a free trial of Google Cloud Platform (GCP)
2. Install and initialize the Google Cloud SDK
3. Create a project with the Google Cloud SDK
4. Create a service account and a service account key so that BigQuery can be accessed from Python
$ gcloud iam service-accounts create SERVICE_ACCOUNT_NAME \
    --display-name "SERVICE_ACCOUNT_DISPLAY_NAME"
$ gcloud projects get-iam-policy mypj-id
# bindings:
# - members:
# - user:[email protected]
# role: roles/owner
# etag: BwWeTrntoao=
# version: 1
Grant the service account the BigQuery administrator role:
$ gcloud projects add-iam-policy-binding PROJECT_ID \
    --member serviceAccount:SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com \
    --role roles/bigquery.admin
List of predefined roles: https://cloud.google.com/iam/docs/understanding-roles?hl=ja#predefined_roles
$ gcloud projects get-iam-policy mypj-id
# bindings:
# - members:
# - user:[email protected]
# role: roles/owner
# - members:
# - serviceAccount:[email protected]
# role: roles/bigquery.admin
# etag: BwWeTz6vIBY=
# version: 1
$ gcloud iam service-accounts keys create ./anata_no_key.json \
    --iam-account SERVICE_ACCOUNT_NAME@PROJECT_ID.iam.gserviceaccount.com
The resulting layout:
.
├── anata_no_key.json
└── working/
└── main.py
Install google-cloud-bigquery with pip:
$ pip install google-cloud-bigquery
main.py
import os
from google.cloud import bigquery
from IPython.display import display
# Get the directory containing this file
cwd = os.path.dirname(os.path.abspath(__file__))
# Point GOOGLE_APPLICATION_CREDENTIALS at the service account key
key_path = '{}/../anata_no_key.json'.format(cwd)
os.environ['GOOGLE_APPLICATION_CREDENTIALS'] = key_path
project_id = 'project_id'
dataset_id = 'dataset_id'
table_id = 'table_id'
client = bigquery.Client(project=project_id)
sql = f"""
select col1, col2, col3
from {dataset_id}.{table_id}
limit 100
"""
df = client.query(sql).to_dataframe()
display(df)
Install pandas-gbq with pip:
$ pip install pandas-gbq
import pandas as pd
from IPython.display import display

# Load a BigQuery table into a DataFrame
project_id = 'project_id'
dataset_id = 'dataset_id'
table_id = 'table_id'
query = f"""
SELECT *
FROM {dataset_id}.{table_id}
LIMIT 10
"""
# dialect='standard' runs the query with standard SQL instead of legacy SQL
df = pd.read_gbq(query, project_id, dialect='standard')
display(df)
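Instead of relying on the GOOGLE_APPLICATION_CREDENTIALS environment variable, read_gbq can also take explicit service account credentials (supported in reasonably recent pandas-gbq versions). A sketch, assuming the key file from the layout above; the helper name is hypothetical:

```python
import pandas as pd
from google.oauth2 import service_account


def read_gbq_with_key(query, project_id, key_path):
    """Load query results into a DataFrame using an explicit key file."""
    # Build credentials directly from the service account key JSON
    credentials = service_account.Credentials.from_service_account_file(key_path)
    return pd.read_gbq(query, project_id, dialect='standard',
                       credentials=credentials)
```

This is convenient in scripts that juggle several service accounts, since nothing leaks through process-wide environment state.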
import pandas as pd
# Write a DataFrame to BigQuery
df = pd.read_csv('path/to/dir/file.csv')
dataset_id = 'dataset_id'
table_id = 'table_id'
df.to_gbq(f'{dataset_id}.{table_id}', project_id='project_id')
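By default to_gbq raises an error if the destination table already exists; the if_exists parameter controls that behavior. A sketch with the same placeholder names; the helper name is hypothetical:

```python
import pandas as pd


def append_to_table(df, dataset_id, table_id, project_id):
    """Append a DataFrame to a BigQuery table, creating it if absent."""
    # if_exists: 'fail' (default) raises, 'replace' overwrites, 'append' adds rows
    df.to_gbq(f'{dataset_id}.{table_id}', project_id=project_id,
              if_exists='append')
```

'append' suits incremental loads such as daily batches, while 'replace' is handy for small tables that are rebuilt from scratch each run.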