Python vs Ruby "Deep Learning from scratch" Chapter 3 Implementation of 3-layer neural network

Overview

This post implements a three-layer neural network in both Python and Ruby, following the code in Chapter 3 of the book "Deep Learning from Scratch: The Theory and Implementation of Deep Learning Learned with Python".

External libraries handle the array math: NumPy for Python and Numo::NArray for Ruby.

If you need to set up an environment, see: Python vs Ruby "Deep Learning from scratch" Chapter 1 Graph of sin and cos functions http://qiita.com/niwasawa/items/6d9aba43f3cdba5ca725
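For reference, the forward pass implemented in the code below is, in matrix notation:

$$a^{(1)} = x W^{(1)} + b^{(1)}, \quad z^{(1)} = \mathrm{sigmoid}(a^{(1)})$$
$$a^{(2)} = z^{(1)} W^{(2)} + b^{(2)}, \quad z^{(2)} = \mathrm{sigmoid}(a^{(2)})$$
$$a^{(3)} = z^{(2)} W^{(3)} + b^{(3)}, \quad y = a^{(3)}$$

The hidden layers use the sigmoid activation, and the output layer uses the identity function.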

Implementation of 3-layer neural network (source code)

Python

import numpy as np

# Weight and bias initialization
def init_network():
  network = {}
  network['W1'] = np.array([[0.1, 0.3, 0.5], [0.2, 0.4, 0.6]])
  network['b1'] = np.array([0.1, 0.2, 0.3])
  network['W2'] = np.array([[0.1, 0.4], [0.2, 0.5], [0.3, 0.6]])
  network['b2'] = np.array([0.1, 0.2])
  network['W3'] = np.array([[0.1, 0.3], [0.2, 0.4]])
  network['b3'] = np.array([0.1, 0.2])
  return network

# Convert the input signal to an output
def forward(network, x):
  W1, W2, W3 = network['W1'], network['W2'], network['W3']
  b1, b2, b3 = network['b1'], network['b2'], network['b3']
  a1 = np.dot(x, W1) + b1
  z1 = sigmoid(a1)
  a2 = np.dot(z1, W2) + b2
  z2 = sigmoid(a2)
  a3 = np.dot(z2, W3) + b3
  y = identity_function(a3)
  return y

# Identity function
def identity_function(x):
  return x

# Sigmoid function
def sigmoid(x):
  return 1 / (1 + np.exp(-x))

# Run
network = init_network()
x = np.array([1.0, 0.5]) # Input layer
y = forward(network, x)
print(y)
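As an aside (this is not the book's code), the same forward pass can also be written as a loop over (weight, bias) pairs. A minimal sketch that reuses the values from init_network() above:

import numpy as np

def sigmoid(x):
  return 1 / (1 + np.exp(-x))

# Same weights and biases as init_network() above
layers = [
  (np.array([[0.1, 0.3, 0.5], [0.2, 0.4, 0.6]]), np.array([0.1, 0.2, 0.3])),
  (np.array([[0.1, 0.4], [0.2, 0.5], [0.3, 0.6]]), np.array([0.1, 0.2])),
  (np.array([[0.1, 0.3], [0.2, 0.4]]), np.array([0.1, 0.2])),
]

z = np.array([1.0, 0.5]) # Input layer
for i, (W, b) in enumerate(layers):
  a = np.dot(z, W) + b
  # Hidden layers use the sigmoid; the output layer uses the identity function
  z = sigmoid(a) if i < len(layers) - 1 else a
print(z) # prints the same two values as the run above

Keeping the weights in a list makes the number of layers easy to change, though the dictionary form above follows the book.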

Ruby

require 'numo/narray'

# Weight and bias initialization
def init_network()
  network = {}
  network['W1'] = Numo::DFloat[[0.1, 0.3, 0.5], [0.2, 0.4, 0.6]]
  network['b1'] = Numo::DFloat[0.1, 0.2, 0.3]
  network['W2'] = Numo::DFloat[[0.1, 0.4], [0.2, 0.5], [0.3, 0.6]]
  network['b2'] = Numo::DFloat[0.1, 0.2]
  network['W3'] = Numo::DFloat[[0.1, 0.3], [0.2, 0.4]]
  network['b3'] = Numo::DFloat[0.1, 0.2]
  network
end

# Convert the input signal to an output
def forward(network, x)
  w1 = network['W1']; w2 = network['W2']; w3 = network['W3']
  b1 = network['b1']; b2 = network['b2']; b3 = network['b3']
  a1 = x.dot(w1) + b1
  z1 = sigmoid(a1)
  a2 = z1.dot(w2) + b2
  z2 = sigmoid(a2)
  a3 = z2.dot(w3) + b3
  identity_function(a3)
end

# Identity function
def identity_function(x)
  x
end

# Sigmoid function
def sigmoid(x)
  1 / (1 + Numo::NMath.exp(-x)) # Numo::NMath.exp returns a Numo::DFloat
end

# Run
network = init_network()
x = Numo::DFloat[1.0, 0.5] # Input layer
y = forward(network, x)
puts y.to_a.join(' ')

Execution result

Python

[ 0.31682708  0.69627909]

Ruby

0.3168270764110298 0.6962790898619668
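The Python and Ruby results agree; the Python output only looks shorter because NumPy's default printing rounds to eight decimal places.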

Reference material

- Python vs Ruby "Deep Learning from scratch" Summary - Qiita http://qiita.com/niwasawa/items/b8191f13d6dafbc2fede
