When I wrote a Lambda function in Python that loads data from Kinesis into BigQuery, uploaded it, and ran the pipeline, the following error occurred.
ImportError: No module named google.protobuf
Traceback (most recent call last):
File "/var/task/main.py", line 17, in kinesis_event_bigquery_handler
insert_records(records)
File "/var/task/main.py", line 29, in insert_records
from gcloud import bigquery
File "/var/task/gcloud/bigquery/__init__.py", line 24, in <module>
from gcloud.bigquery.client import Client
File "/var/task/gcloud/bigquery/client.py", line 18, in <module>
from gcloud.client import JSONClient
File "/var/task/gcloud/client.py", line 20, in <module>
from gcloud._helpers import _determine_default_project
File "/var/task/gcloud/_helpers.py", line 26, in <module>
from google.protobuf import timestamp_pb2
ImportError: No module named google.protobuf
The cause is that protobuf installs `google.protobuf` as a namespace package, and the setuptools machinery that makes it importable does not survive being copied into a Lambda deployment package. The fix is to add an empty `__init__.py` file to the `google` package so Python treats it as a regular package. If you built your environment with virtualenv:
$ touch lib/python2.7/site-packages/google/__init__.py
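If you prefer to apply the same fix from your build script rather than by hand, a minimal sketch looks like this (the helper name `ensure_init_py` and the site-packages path are my own, not part of any library):

```python
import os

def ensure_init_py(site_packages):
    """Create an empty __init__.py inside <site_packages>/google so the
    protobuf namespace package can be imported as a regular package.
    Hypothetical helper for a deployment/build script."""
    pkg_dir = os.path.join(site_packages, "google")
    init_file = os.path.join(pkg_dir, "__init__.py")
    # Only touch the file if the google package directory exists
    # and the marker file is not already present.
    if os.path.isdir(pkg_dir) and not os.path.exists(init_file):
        open(init_file, "a").close()
    return init_file

# Example for the virtualenv layout used above:
# ensure_init_py("lib/python2.7/site-packages")
```

Running this before zipping the deployment package is equivalent to the `touch` command above.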
If you are using lambda-uploader, don't forget the --virtualenv option when uploading.
$ lambda-uploader --virtualenv=.
Reference: https://github.com/awslabs/kinesis-deaggregation/blob/master/python/README.md