The content is almost the same as the Day 2 article, but I later found out that the Day 2 approach cannot be used from Cloud Functions. So I would like to describe the problems I hit and how I dealt with them.
The Day 2 version was fine for running locally. However, the Cloud Functions filesystem is read-only, so I couldn't create a CSV file. That means you have to either export directly to BigQuery or upload to GCS without creating a CSV file on disk. In the end, I chose the latter.
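Uploading without touching the local filesystem can be done by serializing the DataFrame to a string in memory and sending that to GCS. This is a minimal sketch, not the author's exact code; the bucket and object names are placeholders, and it assumes the `google-cloud-storage` package and credentials are available at runtime:

```python
import io

import pandas as pd


def dataframe_to_csv_string(df: pd.DataFrame) -> str:
    # Serialize to CSV entirely in memory, since the Cloud Functions
    # filesystem is read-only and we cannot write a temporary file.
    buf = io.StringIO()
    df.to_csv(buf, index=False)
    return buf.getvalue()


def upload_csv_to_gcs(df: pd.DataFrame, bucket_name: str, blob_name: str) -> None:
    # Imported here so the in-memory helper above works even without
    # the google-cloud-storage package installed.
    from google.cloud import storage

    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    # upload_from_string sends the CSV text directly, no local file needed.
    blob.upload_from_string(dataframe_to_csv_string(df), content_type="text/csv")
```

`upload_from_string` is the key: it replaces the usual "write CSV to disk, then upload the file" flow with a single in-memory step.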
There was an article saying that you can create files in the /tmp/ directory, but I tried it and it didn't work for me, so I gave up on that approach partway through.
It seems that the function to be executed has to accept two parameters.
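For a background (e.g. Pub/Sub-triggered) Cloud Function in Python, those two parameters are conventionally called `event` and `context`. A minimal sketch of what such an entry point could look like; the function name `main` and the body are placeholders, not the author's actual code:

```python
import base64


def main(event, context):
    """Entry point for a background Cloud Function.

    event:   dict carrying the trigger payload; for Pub/Sub the message
             body arrives base64-encoded in event["data"]
    context: metadata object (event_id, timestamp, ...) supplied by the runtime
    """
    payload = ""
    if "data" in event:
        payload = base64.b64decode(event["data"]).decode("utf-8")
    # (placeholder) kick off the actual DataFrame -> GCS upload here
    return payload
```

Even if the function body ignores them, both parameters must be present in the signature or deployment fails.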
In the former case (exporting directly to BigQuery), it seems that pagePath could only contain letters, numbers, and underscores, and there were various other rules as well, so I couldn't use it.
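BigQuery field names are indeed restricted to letters, numbers, and underscores, and may not start with a number. If you wanted to work around that restriction rather than switch approaches, one hypothetical option is to sanitize the offending names before loading; `sanitize_column` below is my own illustration, not something from the original article:

```python
import re


def sanitize_column(name: str) -> str:
    # Replace every character BigQuery disallows in field names with "_".
    cleaned = re.sub(r"[^0-9A-Za-z_]", "_", name)
    # Field names must start with a letter or underscore, not a digit.
    if cleaned and cleaned[0].isdigit():
        cleaned = "_" + cleaned
    return cleaned


# Example: a GA-style dimension name becomes a legal BigQuery column name.
print(sanitize_column("ga:pagePath"))  # -> ga_pagePath
```

Applied to a DataFrame, this would be `df.rename(columns=sanitize_column)` before the load job.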
While searching for articles, I found this person's articles easy to understand: "Google Cloud Storage (GCS) with Python, csv" and "save pandas DataFrame to BigQuery class".
I did wonder whether it was worth going to all this trouble, but since it took me a long time, I hope this helps anyone else in the same situation. Please also take care regarding corona!