Previously, I wrote an article on building a GCP environment for Kaggle easily. This time I found an even easier way, so this is an improved version of that article. A Notebooks feature has been added under AI Platform. With it, you get a ready-to-use Jupyter environment without going through the Jupyter Notebook setup steps described in the previous article. To be precise, what launches is JupyterLab, but if you have used Jupyter Notebook before, you will feel at home right away.
You can set up an instance by pressing the NEW INSTANCE button at the top of the AI Platform Notebooks page. The instance appears to be backed by Compute Engine (GCE). Here you can select the framework you want to use, and the execution environment is set up as easily as with a Deep Learning VM.
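If you prefer the command line over the console, something like the following should create an equivalent instance. This is a minimal sketch: the instance name, zone, machine type, and GPU settings are placeholder values I chose for illustration, and depending on your gcloud version the command may live under `gcloud beta notebooks` or have slightly different flag names.

```bash
# Create an AI Platform Notebooks instance from a Deep Learning VM image.
# "kaggle-notebook", the zone, machine type, and accelerator are example values.
gcloud notebooks instances create kaggle-notebook \
  --location=us-central1-a \
  --machine-type=n1-standard-8 \
  --vm-image-project=deeplearning-platform-release \
  --vm-image-family=pytorch-latest-gpu \
  --accelerator-type=NVIDIA_TESLA_T4 \
  --accelerator-core-count=1 \
  --install-gpu-driver
```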
After the instance is set up, click OPEN JUPYTERLAB. If you select Notebook in the launcher, a notebook opens. It's really easy, isn't it? :smiley:
After that, you can enter commands from the Terminal, and you can also create a Python file by creating a Text File and renaming it to .py without any problem.
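As an example of what you can do from the Terminal, here is a minimal sketch of pulling competition data with the Kaggle API. The competition name is just a placeholder, and it assumes you have already uploaded your own kaggle.json API token through the JupyterLab file browser.

```bash
# Install the Kaggle CLI and place the API token where it expects it.
pip install --user kaggle
mkdir -p ~/.kaggle
cp kaggle.json ~/.kaggle/ && chmod 600 ~/.kaggle/kaggle.json

# "titanic" is a placeholder competition name; swap in your own.
kaggle competitions download -c titanic
unzip -o titanic.zip -d input/
```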
For files, JupyterLab provides a GUI, so you can easily upload and download them.
With this feature, it's much easier to create a computation environment on GCP. Recently, Kaggle's Kernels have had strict usage restrictions, so I think many people are using Google Colab, but since GCP gives you a coupon worth 30,000 yen, I think this is a good opportunity to get started with it. If you have any questions, please feel free to comment.

P.S. I'm currently using this feature in a competition, and apparently there is a problem with handling large files. For example, if you try to download a heavy file from JupyterLab, you end up with a clearly smaller (truncated) file, and if you try to pip install PyTorch, it stops halfway. The workaround is to use gsutil to write files to Cloud Storage, or to create the instance with PyTorch preinstalled from the start... I look forward to future improvements!
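For the gsutil workaround, a minimal sketch from the Terminal looks like the following. The bucket name and file paths are placeholders, and it assumes the instance's service account has write access to the bucket.

```bash
# Copy a large file to Cloud Storage instead of downloading it through JupyterLab.
# "my-kaggle-bucket" is a placeholder bucket name.
gsutil cp ./submission.csv gs://my-kaggle-bucket/outputs/

# -m parallelizes the transfer, which helps with many or large files.
gsutil -m cp -r ./models gs://my-kaggle-bucket/models/

# Later, pull the files down to your local machine with the same command.
gsutil cp gs://my-kaggle-bucket/outputs/submission.csv .
```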