Implement a 3-layer neural network

This article is a learning memo on Deep Learning from Scratch.

Key points

- Activation function: determines how the weighted sum of the input signals is activated (fires). A perceptron uses a "step function" that switches its output at a threshold, whereas a neural network uses a function with a smooth curve such as the "sigmoid function" or the "ReLU function". The activation function of the hidden layers is written h(), and that of the output layer is written σ().
- By using the matrix inner product, the calculation for every neuron in a layer can be done in a single operation (see the sketch below).
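As a minimal sketch of the second point (the array values here are illustrative, matching the 1st-layer numbers used later in this article), one matrix product computes the weighted sums of every neuron in a layer at once:

import numpy as np

# Hypothetical 2-input, 3-neuron layer: x is the input vector,
# W1 the (2, 3) weight matrix, b1 the bias vector.
x = np.array([1.0, 0.5])
W1 = np.array([[0.1, 0.3, 0.5], [0.2, 0.4, 0.6]])
b1 = np.array([0.1, 0.2, 0.3])

a1 = np.dot(x, W1) + b1        # weighted sums for all 3 neurons: [0.3 0.7 1.1]
z1 = 1 / (1 + np.exp(-a1))     # apply the sigmoid h() elementwise
print(z1)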

Diagram of the 3-layer neural network


Sigmoid function

  h(x) = \frac{1}{1+ \mathrm{e}^{-x}}
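As a quick check (a sketch with arbitrary sample inputs), the sigmoid squashes any real number into the open interval (0, 1) and applies elementwise to NumPy arrays:

import numpy as np

def h(x):
    # Sigmoid: 1 / (1 + e^(-x))
    return 1 / (1 + np.exp(-x))

print(h(np.array([-5.0, 0.0, 5.0])))  # approx. [0.0067 0.5 0.9933]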

Conversion formula for each neuron

  a = w_1x_1+w_2x_2+b  
  z = h(a) 
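For concreteness, a small worked example of this two-step conversion for a single neuron (the numbers are illustrative only):

import numpy as np

# One neuron with two inputs: a = w1*x1 + w2*x2 + b, then z = h(a)
x1, x2 = 1.0, 0.5
w1, w2 = 0.1, 0.2
b = 0.1

a = w1 * x1 + w2 * x2 + b    # 0.1 + 0.1 + 0.1 = 0.3
z = 1 / (1 + np.exp(-a))     # sigmoid(0.3) ≈ 0.574
print(a, z)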

Implement a three-layer neural network

3layered_neuralnetwork.py


import numpy as np

# Initialize the weights and biases for each layer
def init_network():
    network = {}
    # 1st layer
    network['W1'] = np.array([[0.1, 0.3, 0.5], [0.2, 0.4, 0.6]])
    network['b1'] = np.array([0.1, 0.2, 0.3])
    # 2nd layer
    network['W2'] = np.array([[0.1, 0.4], [0.2, 0.5], [0.3, 0.6]])
    network['b2'] = np.array([0.1, 0.2])
    # 3rd layer
    network['W3'] = np.array([[0.1, 0.3], [0.2, 0.4]])
    network['b3'] = np.array([0.1, 0.2])

    return network

# Forward propagation: input → output
def forward(network, x):
    W1, W2, W3 = network['W1'], network['W2'], network['W3']
    b1, b2, b3 = network['b1'], network['b2'], network['b3']

    # 1st layer
    a1 = np.dot(x, W1) + b1  # A = XW + B
    z1 = sigmoid(a1)         # Z = h(A)
    # 2nd layer
    a2 = np.dot(z1, W2) + b2
    z2 = sigmoid(a2)
    # 3rd layer
    a3 = np.dot(z2, W3) + b3
    y = identity_function(a3)  # Only the output layer uses a different activation function

    return y

# Sigmoid function (activation function for the hidden layers)
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

# Identity function (activation function for the output layer)
def identity_function(x):
    return x

# Quick check of the forward pass
network = init_network()
x = np.array([1.0, 0.5])
y = forward(network, x)
print(y)  # [0.31682708  0.69627909]
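As a follow-up sketch (reusing init_network() from the code above), printing the shape of each parameter shows why the matrix products line up: the signal flows through dimensions 2 → 3 → 2 → 2.

network = init_network()
for key in ('W1', 'b1', 'W2', 'b2', 'W3', 'b3'):
    print(key, network[key].shape)
# W1 (2, 3) / b1 (3,) / W2 (3, 2) / b2 (2,) / W3 (2, 2) / b3 (2,)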
