Reinforcement learning 23 Create and use your own module with Colaboratory

This article describes how to create and use your own module with Colaboratory.

First, create a folder for the module in Google Drive and name it chokozainerRL. Inside chokozainerRL, create an empty __init__.py file (I made the file with VSCode and uploaded it). Next, create a test.py file like the one below and upload it as well.

test.py


class Test:
    def sayStr(self, s):
        # Print the given string
        print(s)
The chokozainerRL folder in Google Drive now contains:
    __init__.py
    test.py
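
The __init__.py can stay empty; it only marks chokozainerRL as a package. As an optional sketch (not part of the original setup), it can also re-export the class so callers don't need the submodule name:

__init__.py

# chokozainerRL/__init__.py (optional)
# Re-export Test so that "from chokozainerRL import Test" also works.
from .test import Test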

The notebook looks like the following.

import google.colab.drive
google.colab.drive.mount('gdrive')
# Symlink Google Drive into the working directory for convenience.
!ln -s gdrive/My\ Drive mydrive
# Symlink the module folder so it can be imported by name.
!ln -s gdrive/My\ Drive/chokozainerRL chokozainerRL

from chokozainerRL import test
a = test.Test()
a.sayStr("Hello Papa")
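
If you prefer not to use symlinks, an alternative (a minimal sketch, assuming Drive is mounted at gdrive as above and the notebook's working directory is /content) is to add the Drive folder that contains chokozainerRL to sys.path:

import sys
# Let Python find packages stored directly in Google Drive.
# Assumes the mount point from the cell above ('/content/gdrive').
sys.path.append('/content/gdrive/My Drive')

from chokozainerRL import test
a = test.Test()
a.sayStr("Hello Papa")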

As a program grows, it becomes increasingly convenient to split it out into your own module like this.
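
For example (a hypothetical extension, not described in this article), adding another file such as agent.py to the chokozainerRL folder makes it importable in exactly the same way:

agent.py

class Agent:
    def __init__(self, name):
        # Hypothetical example class; any module file in the folder works the same way.
        self.name = name

    def greet(self):
        print("Agent " + self.name + " is ready")

In the notebook:

from chokozainerRL import agent
b = agent.Agent("cartpole")
b.greet()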
