I made a neural network generator that runs on an FPGA

Overview

I made a neural network generator that runs on an FPGA.

Features

- Generates feed-forward neural networks
- Outputs are 0 or 1 only
- Output is Verilog HDL
- Asynchronous circuit (no clock required)
- Supports fixed-point numbers with arbitrary bit width
- The activation function is a sigma function
- Inference only (learning is not supported)

The repository is here: https://github.com/kotauchisunsun/NN_FPGA

Operating environment

- Python 2.7 or later (less than 3.0)

Python 2.7 is required for the generator; Icarus Verilog is only needed if you want to run simulations. No special libraries are required; only the Python standard library is used.

License

Affero GPL v3

How to use

To generate the structural part of a neural network, the basic syntax is:

$ python script/main.py width input_num structure output_num

- width: fixed-point bit width
- input_num: number of input signals
- structure: the network structure; a multi-layer structure can be expressed by separating layer sizes with commas (described later)
- output_num: number of output signals

Example:

$ python script/main.py 16 2 2 2
> NN_NL_016_0002_0002_NL_016_0002_0002
> saved to generate.v
> None

As a result, a neural network called NN_NL_016_0002_0002_NL_016_0002_0002, which operates on 16-bit fixed-point numbers, is written to generate.v. It corresponds to the 2-2-2 network shown in the figure below.

[Figure: the generated 2-2-2 network, with its inputs, weights, biases, and outputs labeled]

The meaning of each signal is:

- input*: inputs
- output*: outputs
- w*: weights applied to the inputs
- b*: neuron biases

For example, the condition for the upper-left neuron to fire is:

if input1 * w1 + input2 * w3 + b1 > 0:
    return 1
else:
    return 0

That is, a neuron outputs 1 exactly when its weighted sum plus its bias is greater than zero.
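For reference, here is a minimal Python sketch (my own illustration, not part of the repository) that generalizes this firing rule to a neuron with any number of inputs; the example values are made up.

```python
def neuron_fires(inputs, weights, bias):
    """Return 1 if the weighted sum of the inputs plus the bias is
    positive, and 0 otherwise (the firing rule shown above)."""
    total = bias
    for x, w in zip(inputs, weights):
        total += x * w
    return 1 if total > 0 else 0

# Upper-left neuron of the 2-2-2 network: inputs input1/input2,
# weights w1/w3, bias b1, as in the condition above.
print(neuron_fires([0.5, -0.25], [1.0, 2.0], 0.1))  # -> 1
```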

The arguments passed to NN_NL_016_0002_0002_NL_016_0002_0002 are given in the following order:

  1. input*
  2. w*
  3. b*
  4. output*

Written out in full for this example, the argument list is

input1,input2,w1,w2,w3,w4,w5,w6,w7,w8,b1,b2,b3,b4,output1,output2

The inference results of the network appear on output1 and output2.
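As a quick sanity check, this small Python snippet (not from the repository, just an illustration) spells out that flattened argument order for the 2-2-2 network:

```python
# Port order for NN_NL_016_0002_0002_NL_016_0002_0002:
# inputs, then weights, then biases, then outputs.
inputs = ["input%d" % i for i in range(1, 3)]     # input1, input2
weights = ["w%d" % i for i in range(1, 9)]        # w1 .. w8
biases = ["b%d" % i for i in range(1, 5)]         # b1 .. b4
outputs = ["output%d" % i for i in range(1, 3)]   # output1, output2

print(",".join(inputs + weights + biases + outputs))
# input1,input2,w1,w2,w3,w4,w5,w6,w7,w8,b1,b2,b3,b4,output1,output2
```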

In addition, main.py can also build multi-layer neural networks. For example:

$ python script/main.py 16 32 64,32,48 16

This builds a neural network that handles 16-bit fixed-point values and has the following structure:

| Layer | Number of units |
| --- | --- |
| Input layer | 32 |
| Hidden layer 1 | 64 |
| Hidden layer 2 | 32 |
| Hidden layer 3 | 48 |
| Output layer | 16 |
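To get a feel for how quickly the circuit grows, here is a small Python sketch (not part of the repository) that counts the weights and biases of this example, assuming every layer is fully connected to the next, as in the 2-2-2 example above:

```python
# Layer sizes of the example above: 32 inputs, hidden layers of 64, 32 and 48, 16 outputs.
layers = [32, 64, 32, 48, 16]

# Assuming full connectivity between adjacent layers (as in the 2-2-2 example),
# each pair of adjacent layers contributes n_in * n_out weights,
# and every non-input unit has one bias.
weights = sum(a * b for a, b in zip(layers[:-1], layers[1:]))
biases = sum(layers[1:])

print("%d weights, %d biases" % (weights, biases))  # 6400 weights, 160 biases
```

In the 2-2-2 example every weight and bias is an argument of the generated module, so these counts give a rough sense of how large the interface and circuit become for bigger networks.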

Fixed-point representation

This neural network does not support the floating-point numbers that are commonly used, so a little extra work is needed to express the weights and inputs in the right form.

The fixed-point format is as follows. For width = 16:

- Bits 1 to 8: fractional part; bit i represents 2^(-8 + i - 1). (C)
- Bits 9 to 15: integer part; bit i represents 2^(i - 8 - 1). (N)
- Bit 16: negative-number flag. (F)

| bit | 16 | 15 | 14 | 13 | 12 | 11 | 10 | 9 | 8 | 7 | 6 | 5 | 4 | 3 | 2 | 1 |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| meaning | F | N | N | N | N | N | N | N | C | C | C | C | C | C | C | C |

In general, the lower width / 2 bits are the fractional part, the top bit is the negative-number flag, and the bits in between are the integer part.

This is hard to follow in words alone, so I prepared a conversion script.

$ python script/convert_number.py width number

- width: fixed-point bit width
- number: the value you want to convert

Example:

$ python script/convert_number.py 16 5.5

16b'0000010110000000 ABS ERROR = 0.000000e+00


This shows that 5.5 expressed as a 16-bit fixed-point number is 16b'0000010110000000.
This is the value to supply when building the circuit in Verilog.
Here, ABS ERROR is the absolute error introduced by representing the decimal value as a 16-bit binary fixed-point number, shown in decimal notation.
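To make the bit layout above concrete, here is a rough Python sketch of the conversion. It is my own reconstruction, not the actual convert_number.py, and it treats the negative-number flag as the sign bit of a two's-complement value, which is an assumption on my part (it happens to match the examples in this post):

```python
def to_fixed(width, value):
    """Convert a decimal value to the width-bit fixed-point format
    described above: the lower width/2 bits are the fractional part.
    Negative values are encoded here as two's complement, which is my
    assumption, not a documented spec."""
    scale = 2 ** (width // 2)
    q = int(value * scale)                 # quantize (truncates toward zero)
    bits = q & ((1 << width) - 1)          # wrap into `width` bits
    abs_error = abs(value - float(q) / scale)
    return "%db'%s ABS ERROR = %e" % (width, format(bits, "0%db" % width), abs_error)

print(to_fixed(16, 5.5))   # 16b'0000010110000000 ABS ERROR = 0.000000e+00
print(to_fixed(16, -1.1))  # 16b'1111111011100111 ABS ERROR = 2.343750e-03
```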

Example:

$ python script/convert_number.py 16 -1.1

16b'1111111011100111 ABS ERROR = 2.343750e-03


When -1.1 is represented as a 16-bit fixed-point number, ABS ERROR = 2.343750e-03, so you can see that some error is introduced. If we instead use a 32-bit fixed-point representation:

Example:

$ python script/convert_number.py 32 -1.1

32b'11111111111111101110011001100111 ABS ERROR = 9.155273e-06


ABS ERROR becomes 9.155273e-06, so the error is smaller than with the 16-bit representation. Increasing the bit width improves accuracy, but it also increases the circuit scale and the design may no longer fit on the FPGA, so tune the width with that trade-off in mind.

In closing

It has been 4 months since I started working with Verilog HDL and FPGAs. If you find any bugs or mistakes, or have ideas for extensions, please let me know.

