Introducing Udacity Deep Learning Nanodegree

TL;DR: I completed Udacity's Deep Learning Nanodegree. TensorFlow is hard and hyperparameter tuning is difficult, but I learned a lot.

What is a Udacity Nanodegree?

Udacity is a so-called MOOC (Massive Open Online Course) platform where you can learn programming and design online. The basic Udacity courses are free, but it also offers paid mini-degrees (Nanodegrees) that bundle several courses, and some content can only be seen there.

There are other online course sites such as Lynda.com and Code School, but unlike those, Udacity requires you to submit assignments in addition to watching videos, and you do not pass unless your work meets the required standard. Compared with Coursera, which is a similar type of service, Coursera feels more like a university lecture, while Udacity feels more like corporate training.

What is Deep Learning Nanodegree?

The Deep Learning Nanodegree is a new Nanodegree that started in 2017.

It specializes in deep learning, and if you finish successfully you are guaranteed admission to the Self-Driving Car and Artificial Intelligence Nanodegrees. According to a TechCrunch article, only 16% and 4.5% of applicants are admitted to those, respectively, so the guarantee sounds valuable. That said, when I applied to the self-driving car course as a mechanical engineering graduate claiming only that I had taught myself Python, I was admitted normally, so if you have an engineering background you may be able to go straight to the course you actually want.

Note that the Deep Learning Nanodegree does not cover non-deep machine learning such as SVMs and decision trees, which do appear in the self-driving car course. Conversely, RNNs are covered only here (at least as far as I could confirm up to Term 1 of that course). Presumably the Machine Learning Engineer Nanodegree includes both?

Siraj doesn't actually teach that much

By the way, when you open the Udacity page for this course, the YouTuber Siraj is featured prominently as the instructor. (screenshot: udacity_top.png)

However, when I actually took the course, the main instructor turned out to be Mat, who appears next to Siraj. The preliminary explanations before each project assignment are also taught by Mat. Siraj's lessons and Mat's lessons alternate, but since Siraj's parts are essentially standalone reposts of his YouTube videos, my impression is that they serve as introductions to lower the barrier to entry.

In the GAN part, Ian Goodfellow, the author of the original GAN paper, explains GANs himself. It's amazing. (screenshot: ian.png)

The framework is TensorFlow

The actual lectures involve working in Jupyter Notebooks in addition to the explanation videos and text on the web. All the code is [published on GitHub](https://github.com/udacity/deep-learning), and each assignment is submitted as a filled-in Jupyter Notebook.

Deep learning by its nature requires a GPU, and free credits for AWS and FloydHub are provided. However, if you run Jupyter Notebook as a server there for convenience, it can be reachable by anyone unless you set up IP restrictions or authentication, so it may be safer to have a GPU machine at hand.
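If you do run the notebooks on a cloud instance, you can at least require a password and run the server headless. Below is a minimal sketch of a `jupyter_notebook_config.py` for the classic Jupyter Notebook server that was current at the time; the values are my own illustration, not something the course specifies.

```python
# Minimal sketch of ~/.jupyter/jupyter_notebook_config.py for a cloud GPU box.
# Generate the password hash once with:
#   python -c "from notebook.auth import passwd; print(passwd())"

c = get_config()  # provided by Jupyter when it loads this file

c.NotebookApp.ip = '0.0.0.0'          # listen on all interfaces so the instance is reachable
c.NotebookApp.port = 8888             # default port; also restrict it in the cloud security group
c.NotebookApp.open_browser = False    # headless server, no local browser
c.NotebookApp.password = 'sha1:...'   # paste the hash printed by notebook.auth.passwd() here
```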

Since the framework is TensorFlow, you are often required to implement things at a lower level than in Chainer or Keras. That builds real strength, but it also demands a substantial time commitment. There are Slack channels and forums, but apparently just staying on schedule puts you in the top 20% by the end of Project 4, so the dropout rate may be high.
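To give a feel for what "lower level" means here: in the TensorFlow 1.x style used in the course you wire up placeholders, variables, and a session yourself, where Keras would hide most of this behind a single layer call. This is my own minimal sketch, not course code:

```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x style, as used in the course at the time

# One fully connected layer with a sigmoid, written "by hand".
x = tf.placeholder(tf.float32, shape=[None, 4], name='inputs')
W = tf.Variable(tf.truncated_normal([4, 3], stddev=0.1), name='weights')
b = tf.Variable(tf.zeros([3]), name='bias')
out = tf.sigmoid(tf.matmul(x, W) + b)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(out, feed_dict={x: np.random.rand(2, 4)}))

# In Keras the same layer would be roughly:
#   model.add(Dense(3, activation='sigmoid', input_shape=(4,)))
```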

Actual lecture content

Here is what you actually learn in each part:

  1. Neural Networks: Instructions on using Anaconda and Jupyter Notebook. The project is to implement a sigmoid-based neural network from scratch with numpy (a minimal sketch of this appears after the list).

  2. Convolutional Networks: Sentiment analysis and TFLearn, an implementation of backpropagation, and an explanation of CNNs. The project is image classification. Dropout is also covered.

  3. Recurrent Neural Networks: RNNs, word2vec, hyperparameters, TensorBoard, weight initialization, transfer learning, seq2seq, and a brief touch on reinforcement learning. The projects are generating Simpsons dialogue and machine translation.

Personally, the most interesting part was weight initialization. Comparing the final accuracy obtained with different ranges ([0,1] vs. [-1,1]) and different distributions (uniform vs. truncated_normal) was eye-opening (see the second sketch after the list).

  4. Generative Adversarial Networks: GANs and DCGANs, plus semi-supervised learning. The project is generating face images. The explanations of the parts that are easy to get wrong in implementation made it easy to follow, but even so, tuning did not go smoothly; GANs are hard.
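For the first project mentioned above, the "sigmoid network in numpy" boils down to a forward pass plus a hand-written gradient step, roughly like the following. This is my own minimal sketch of the idea, not the project code; the toy data and layer sizes are arbitrary.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy data: 4 samples, 3 features, 1 target each (illustrative values only).
X = np.random.rand(4, 3)
y = np.random.rand(4, 1)

# One hidden layer of 2 units, output layer of 1 unit.
W1 = np.random.normal(0.0, 0.1, size=(3, 2))
W2 = np.random.normal(0.0, 0.1, size=(2, 1))
lr = 0.5

for _ in range(1000):
    # Forward pass
    h = sigmoid(X @ W1)       # hidden activations
    out = sigmoid(h @ W2)     # network output

    # Backward pass (squared error; sigmoid derivative is s * (1 - s))
    out_err = (y - out) * out * (1 - out)
    h_err = out_err @ W2.T * h * (1 - h)

    # Gradient step
    W2 += lr * h.T @ out_err / len(X)
    W1 += lr * X.T @ h_err / len(X)
```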
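On the initialization comparison mentioned above, the exercise amounts to training the same model several times while swapping only the weight initializer and comparing the final accuracy. A rough TensorFlow 1.x sketch of the variants being compared (the names and shapes are mine):

```python
import tensorflow as tf  # TensorFlow 1.x style

shape = [784, 256]  # e.g. a hidden layer for MNIST-sized inputs

# Candidate initializations to compare, as in the lecture exercise:
inits = {
    'uniform_0_1':      tf.random_uniform(shape, minval=0.0,  maxval=1.0),
    'uniform_-1_1':     tf.random_uniform(shape, minval=-1.0, maxval=1.0),
    'truncated_normal': tf.truncated_normal(shape, stddev=0.1),
}

# You would then build the same network with W = tf.Variable(inits[name])
# for each name, train it, and compare the final validation accuracy.
```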

Overall impressions

While the course was running, TensorFlow 1.0 and Keras 2.0 were announced. The sample code in the latter half had already been updated to TensorFlow 1.0, so the turnaround was very quick.

Since machine learning is evolving rapidly, printed books tend to go out of date quickly. The strength of an online course is that it can be updated continuously, so I felt the format was a good fit for the field.
