"Deep Learning from Scratch" in Haskell (unfinished)
It's the year-end and New Year holidays! Time to write some code!
So I took on the challenge of working through "Deep Learning from Scratch" in Haskell.
Why use Haskell?
- Haskell is cool, isn't it?
- Just following the book as-is is no fun
- Haskell has fusion, so I wanted to verify whether it might actually be well suited even to a field like deep learning, where computational efficiency matters
Conclusion
**Failure. My Haskell wasn't strong enough**
I'll leave the code I wrote here, in the hope that someone can step over my corpse and carry on.
https://github.com/spinylobster/my-deep-learning-from-scratch-in-haskell
~~Progress was poor: I wrote the inference and training code for the neural network, but never confirmed that it actually learns.
Backpropagation is not implemented either, so I'm roughly in the middle of Chapter 4~~
- Addendum 2017/1/9: Made some progress; training of the neural network now works end to end.
I've also implemented AdaGrad, so I'm roughly in the middle of Chapter 6
My Haskell skills are weak, so the code is convoluted and doesn't follow the book's code, but it may still be useful as a reference for how to use the various libraries.
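The AdaGrad update mentioned in the addendum can be sketched in plain Haskell like this (over flat lists of parameters; the repository works with hmatrix matrices, and all names here are illustrative, not taken from it):

```haskell
-- Minimal AdaGrad sketch: accumulate squared gradients per parameter and
-- scale the learning rate by the inverse square root of that history.
-- eps keeps the division well-defined before any gradient has arrived.
adagradStep :: Double        -- learning rate
            -> [Double]      -- parameters
            -> [Double]      -- gradients
            -> [Double]      -- accumulated squared gradients
            -> ([Double], [Double])  -- (new parameters, new history)
adagradStep lr params grads hist = (params', hist')
  where
    eps     = 1e-7
    hist'   = zipWith (\h g -> h + g * g) hist grads
    params' = zipWith3 (\p g h -> p - lr * g / (sqrt h + eps)) params grads hist'
```

Threading `hist'` through successive calls gives the per-parameter learning-rate decay that distinguishes AdaGrad from plain SGD.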
What I think went wrong
- Spent too much time studying Haskell itself
- Relearning forgotten knowledge: using Stack, monad transformers, data serialization, etc.
- My only background was having read "Learn You a Haskell" (the "すごいH本") a few years ago
- Spent a lot of time figuring out how to use Gnuplot
- I had a computation that needed to carry state, and I didn't know how to write it
- ~~Matrix calculations aren't type-safe, so I suffered from run-time errors~~ → **it seems type-safe calculation is possible!**
- Debugging a running program in Haskell is really hard
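For the "computation that needed to carry state" problem, the usual answer is the State monad from the mtl package. A toy sketch (hypothetical example, not code from the repository):

```haskell
import Control.Monad.State (State, evalState, get, put)

-- Toy example: number each element of a list by threading a counter
-- through the traversal, instead of passing it around by hand.
numberAll :: [a] -> [(Int, a)]
numberAll xs = evalState (mapM step xs) 0
  where
    step :: a -> State Int (Int, a)
    step x = do
      n <- get        -- read the current counter
      put (n + 1)     -- update it for the next element
      return (n, x)
```

The same pattern scales up to, say, threading network weights and optimizer history through a training loop.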
What I used
- Stack --Without this, you'd be installing every package by hand.
- Haskell Super Primer --To brush up on the basic grammar
- hmatrix --The equivalent of NumPy, for matrix calculations. There is also Repa; I had planned to compare their performance later.
- gnuplot --The equivalent of matplotlib, for drawing graphs. The quickest way to learn how to use it is the link in the Helpful URLs section.
- GHCi Debugger --I only just learned it existed. Lazy evaluation and debuggers seem to be a poor match; it was really hard to use
- Other libraries --see the cabal file
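A quick sketch of basic hmatrix usage, for comparison with NumPy (illustrative, assuming a recent hmatrix; not code from the repository):

```haskell
import Numeric.LinearAlgebra (Matrix, (><), cmap, scale, toLists)

-- Build a 2x2 matrix row-major with the (><) constructor.
m :: Matrix Double
m = (2><2) [1, 2, 3, 4]

-- (+), (-), (*), (/) act element-wise via the Num instance.
ew :: Matrix Double
ew = m * m

-- cmap maps a function over every element, like a NumPy ufunc.
doubled :: Matrix Double
doubled = cmap (* 2) m

-- Explicit scalar multiplication; no need to rely on broadcasting.
scaled :: Matrix Double
scaled = scale 2 m
```

Using `scale` (or `cmap`) instead of mixing matrices and bare numbers sidesteps the unclear broadcasting behaviour mentioned in the notes below.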
Impressions of the book
- **Insanely good**, easy to understand
- It covers everything from an introduction to neural networks through to what to study next. **I think there is no better introductory book**
- That said, I still don't really understand deep learning itself
- I don't understand the design intent behind the CNN convolution layer
- That may be because I didn't write that part of the code myself
- I want to work it out together with other materials
Helpful URLs
- Prepare Haskell environment with VS Code
- http://qiita.com/DUxCA/items/8e7a68ffee522bdd8918
- Install gnuplot on Mac
- http://mashiroyuya.hatenablog.com/entry/installgnuplot
- Use gnuplot from Haskell
- http://d.hatena.ne.jp/mmitou/20120912/1347441319
- Understanding how to use hmatrix
- https://github.com/albertoruiz/hmatrix/tree/master/examples
- You can use `+ - * /` for element-wise calculations
- The map over a matrix is `cmap`
- Broadcasting like `matrix * 2` sometimes seems to work and sometimes not; I'm not sure
- Haskell Library Impressions 2016
- http://syocy.hatenablog.com/entry/haskell-library-2016
- Type-level programming in Haskell
- http://konn-san.com/articles/2012-06-07-promoted-types-and-list-arguments.html
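On the type-safe matrix point above: hmatrix ships a `Numeric.LinearAlgebra.Static` module that uses the type-level programming described in the last link to put dimensions into the types, so mismatched multiplications fail at compile time. A minimal sketch (assuming hmatrix >= 0.17):

```haskell
{-# LANGUAGE DataKinds #-}
import Numeric.LinearAlgebra.Static (L, matrix, mul, extract)
import Numeric.LinearAlgebra (toLists)

-- Row and column counts are part of the type: L rows cols.
a :: L 2 3
a = matrix [1, 2, 3, 4, 5, 6]

b :: L 3 2
b = matrix [1, 0, 0, 1, 1, 0]

-- mul only type-checks when the inner dimensions match;
-- e.g. `mul a a` would be rejected at compile time.
c :: L 2 2
c = mul a b
```

`extract` drops back to an ordinary `Matrix Double` when you need the dynamically-sized API.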