This article is an easy-to-understand summary of **Deep Learning from scratch, Chapter 7: Learning Techniques**. I come from a humanities background and was still able to follow it, so I hope you will find it easy to read. I would also be delighted if you used it as a reference while studying the book.
Do you know what ensemble learning is? Ensemble learning produces good results by training multiple models and combining their outputs. Dropout improves learning results by reproducing this ensemble effect in a simulated way within a single network.
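As a rough illustration of the idea (not taken from the book), here is a minimal sketch of ensemble averaging: several independently trained models each make a prediction, and the final output is their average. The `models` list and the toy "models" below are hypothetical placeholders, not the book's code.

```python
import numpy as np

# Hypothetical sketch: average the predictions of several trained models.
# Each "model" here is just a callable returning scores for the input x.
def ensemble_predict(models, x):
    predictions = [model(x) for model in models]  # one prediction per model
    return np.mean(predictions, axis=0)           # average across the models

# Toy usage: three "models" that each output slightly different scores
models = [lambda x, b=b: x + b for b in (0.0, 0.1, -0.1)]
print(ensemble_predict(models, np.array([0.2, 0.8])))
```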
So what does Dropout actually do? It randomly erases (deactivates) neurons during training. Because a different set of neurons is erased on each pass, Dropout effectively trains many different models at once, which is what makes it resemble ensemble learning.
Below is a simple implementation example.
```python
import numpy as np

class Dropout:
    # Placed after an activation-function layer. The mask is applied on every
    # training pass; during prediction (predict) the mask is not applied.
    def __init__(self, dropout_ratio=0.5):
        self.dropout_ratio = dropout_ratio
        self.mask = None  # Holds the array marking which neurons to erase

    def forward(self, x, train_flg=True):
        if train_flg:
            # Randomly decide which neurons to erase
            self.mask = np.random.rand(*x.shape) > self.dropout_ratio
            return x * self.mask
        else:
            # At prediction time, scale the output by the fraction of neurons kept
            return x * (1 - self.dropout_ratio)

    def backward(self, dout):
        # Pass gradients only through neurons that were not erased (same idea as ReLU)
        return dout * self.mask
```
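As a quick check (my own example, not from the book), the layer can be exercised like this; the input shape, the fixed seed, and `dropout_ratio=0.5` are arbitrary choices for illustration.

```python
np.random.seed(0)                  # fixed seed so the random mask is reproducible
x = np.random.randn(2, 5)          # a small batch of example activations

layer = Dropout(dropout_ratio=0.5)
out_train = layer.forward(x, train_flg=True)   # roughly half the values are zeroed
out_test = layer.forward(x, train_flg=False)   # whole input scaled by (1 - 0.5)
grad = layer.backward(np.ones_like(x))         # gradient flows only where mask is True

print(out_train)
print(out_test)
print(grad)
```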