Google Colab is amazing: it lets you use a GPU for free with surprisingly little effort. I was originally taught how to do this on AWS, but Colab is an order of magnitude easier, which almost makes me cry. This is a story about doing machine learning with PyTorch on Colab.
To get the hang of it, I tried things out while reading various Qiita articles and wrote down the method I ended up with. A lot of the information out there was old, so it was harder than expected.
Runtime > Change runtime type > select GPU
!pip install torch
import torch
torch.cuda.is_available()
If you run this and it returns True, you can use the GPU. PyTorch bundles CUDA support for running on the GPU, so it works out of the box (admittedly a rough explanation).
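As a minimal sketch (the tensor and its size are just made-up examples), once is_available() returns True you can pick a device and move tensors or models onto it like this:
import torch

# Use the GPU if it is available, otherwise fall back to the CPU
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Create a tensor on the CPU, then move it to the chosen device
x = torch.randn(3, 3)
x = x.to(device)
print(x.device)  # prints cuda:0 when the GPU is active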
There seem to be several ways to work with files.
from google.colab import files
uploaded = files.upload()
Running this shows an upload dialog and lets you pick files from your local machine.
However, there are times when I want to use an entire dataset, so being able to upload only individual files is a problem. Is there a way to handle whole folders?
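For reference, a small sketch of what the call gives back: files.upload() returns a dict mapping each uploaded filename to its contents as bytes, so you can check what arrived.
from google.colab import files

uploaded = files.upload()
# The returned dict maps filename -> file contents (bytes)
for name, data in uploaded.items():
    print(name, len(data), "bytes")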
Mounting Google Drive is the answer. There are older articles with a lot of fiddly code and extra authentication steps, but it seems to have become much more convenient recently. It was really easy.
First, click "Mount Drive" in the file pane (the > icon) on the left. The code below then gets inserted into the notebook automatically:
from google.colab import drive
drive.mount('/content/drive')
Just run it, open the link that appears, and copy the authorization code back into the prompt.
/content/drive/My Drive/some-folder
If you write a path like that, you can treat your Drive contents like a local folder. Awesome ~ thank you ~
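As a minimal sketch (the folder and file names here are placeholders), once the Drive is mounted you can read files with ordinary Python paths:
import os

# "some-folder" and "data.csv" are placeholder names for illustration
drive_dir = "/content/drive/My Drive/some-folder"
print(os.listdir(drive_dir))  # list the folder contents

with open(os.path.join(drive_dir, "data.csv")) as f:
    print(f.readline())  # read the first line of a file stored on Drive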
When I use Colab to write up notes, I sometimes want to paste in a reference image, but the notebook isn't connected to my local machine and there is no insert-image feature. In that case, first upload the image file from the side panel and name it Figure_1.png.
from IPython.display import Image,display_png
display_png(Image('Figure_1.png'))
If you run this, the image stays displayed even after a while, which is nice. It does feel like a stopgap, though.