[Machine learning] Create a machine learning model by performing transfer learning with your own data set

1. Introduction

In previous articles (Scraping, Image processing with OpenCV) I introduced how to create your own dataset. In this article, I use that dataset to perform transfer learning and build a machine learning model that judges whether a face is Japanese or foreign.

2. What I want to do this time

Prepare the dataset required to create a machine learning model ← covered up to the previous article
---------- What this article covers from here ----------
↓ Create a machine learning model using transfer learning.
↓ Use the model to judge photos of Japanese people and foreigners.

3. What is transfer learning?

In a nutshell, transfer learning is a training approach used to obtain a well-performing machine learning model in a short time. In general, the deeper and wider the layers of a machine learning model are, the better its performance (accuracy) tends to be. However, building such a deep and wide model from scratch requires a huge amount of time and data. With transfer learning, you instead reuse everything except the fully connected layers of an existing high-performance model (such as VGG16) as a feature extraction part, and build and train only the fully connected layers yourself. As a result, only the fully connected layers need to be trained, so a well-performing model can be built in far less time than building one from scratch.

[Figure: diagram of transfer learning]
Source: "What is transfer learning? How do you do the expected 'transfer learning' in deep learning?"

4. Source code

The source code used this time is shown below.

cnn.py


from keras.layers import Dense, Dropout, Flatten, Activation
from keras.layers import Conv2D, MaxPooling2D, Input, BatchNormalization
from keras.models import Sequential, load_model, Model
from keras.applications.vgg16 import VGG16
from keras.optimizers import SGD
from keras.preprocessing import image
import numpy as np
import matplotlib.pyplot as plt
import ssl
# Work around SSL certificate errors when downloading the VGG16 weights
ssl._create_default_https_context = ssl._create_unverified_context
epochs = 10

# Plot accuracy and loss for each epoch on a graph
def show_graph(history):

    # Training curves (note: newer Keras versions store these under 'accuracy' / 'val_accuracy')
    acc = history.history['acc']
    val_acc = history.history['val_acc']
    loss = history.history['loss']
    val_loss = history.history['val_loss']

    epochs = range(len(acc))

    # 1) Accuracy plot
    plt.plot(epochs, acc, 'bo', label='training acc')
    plt.plot(epochs, val_acc, 'b', label='validation acc')
    plt.title('Training and Validation acc')
    plt.legend()

    plt.figure()

    # 2) Loss plot
    plt.plot(epochs, loss, 'bo', label='training loss')
    plt.plot(epochs, val_loss, 'b', label='validation loss')
    plt.title('Training and Validation loss')
    plt.legend()
    plt.show()

# Load the training and test data from the dataset created in the previous articles
(X_train, y_train, X_test, y_test) = np.load('Dataset PATH')
# Convert the images to float32 and scale pixel values from 0-255 to 0-1
X_train = np.array(X_train)
X_train = X_train.astype('float32')
X_train /= 255
y_train = np.array(y_train)
X_test = np.array(X_test)
X_test = X_test.astype('float32')
X_test /= 255
y_test = np.array(y_test)

# Data augmentation: random rotations and horizontal/vertical shifts
datagen = image.ImageDataGenerator(
    rotation_range=20,
    width_shift_range=0.2,
    height_shift_range=0.2
)
datagen.fit(X_train)

# Input tensor matching the 64x64 RGB images in the dataset
input_tensor = Input(shape=(64, 64, 3))

# Load VGG16 pretrained on ImageNet, without its fully connected layers
vgg16 = VGG16(include_top=False, weights='imagenet', input_tensor=input_tensor)

# Fully connected classifier head to attach on top of VGG16
top_model = Sequential()
top_model.add(Flatten(input_shape=vgg16.output_shape[1:]))
top_model.add(Dense(256, activation='relu'))
top_model.add(BatchNormalization())
top_model.add(Dropout(0.5))
top_model.add(Dense(2, activation='softmax'))

# Combine vgg16 and top_model into a single model
model = Model(inputs=vgg16.input, outputs=top_model(vgg16.output))

# Freeze the weights of the first 19 layers (the VGG16 feature extraction part)
for layer in model.layers[:19]:
    layer.trainable = False
model.summary()

model.compile(loss='categorical_crossentropy',
              optimizer=SGD(lr=1e-4, momentum=0.9),
              metrics=['accuracy'])

history = model.fit_generator(datagen.flow(X_train, y_train, batch_size=32),
                              steps_per_epoch=len(X_train) // 32,
                              epochs=epochs,
                              validation_data=(X_test, y_test))

# Evaluate accuracy on the test set
scores = model.evaluate(X_test, y_test, verbose=1)
print('Test loss:', scores[0])
print('Test accuracy:', scores[1])
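Note that show_graph, defined at the top of the listing, is never actually called, and the trained model is not saved. To plot the training curves and keep the model for reuse (for example in the web application planned for the next article), something like the following could be appended; the file name here is just an example.

# Plot accuracy and loss curves with the helper defined above
show_graph(history)

# Save the trained model for later use (example file name)
model.save('transfer_model.h5')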

The data loaded from the dataset consists of 64 × 64 color images and their labels. X_train and X_test hold the image data, and y_train and y_test hold the labels. Looking at the contents of X_train:

print(X_train.shape)
print(X_train[0])

(1547, 64, 64, 3)
[[[ 36  40  50]
  [ 40  46  59]
  [ 57  64  82]
  ...
  [114 120 124]
  [161 155 152]
  [141 118 109]]

 ...

 [[203 146 115]
  [210 154 123]
  [182 128  95]
  ...
  [249 250 248]
  [251 243 241]
  [228 212 213]]]

It looks like this: X_train contains 1547 color images of size 64 × 64. The values are 8-bit unsigned integers from 0 to 255, so they are divided by 255 to scale them into the range 0 to 1, which makes training easier.

If you check the y_train data in the same way,

print(y_train.shape)
print(y_train[0])

(1547, 2)
[0 1]

You can see that y_train stores one one-hot vector for each of the images in X_train.
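Incidentally, if a dataset stores plain integer class labels instead, they can be converted into one-hot vectors of this shape with keras.utils.to_categorical. A minimal sketch with made-up labels, not the article's actual data:

from keras.utils import to_categorical
import numpy as np

labels = np.array([0, 1, 1])          # hypothetical integer labels for three images
one_hot = to_categorical(labels, num_classes=2)
print(one_hot)
# [[1. 0.]
#  [0. 1.]
#  [0. 1.]]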

Next, load the VGG16 model. See the Keras Applications documentation for details on loading VGG16. Three arguments are set here. include_top indicates whether to include the three fully connected layers on the output side of the network; since we are doing transfer learning, the fully connected layers of VGG16 are not needed, so it is set to False. weights determines the weights of VGG16: None gives random initialization, while 'imagenet' gives the pretrained ImageNet weights. input_tensor specifies the size of the input images.
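Because include_top=False, VGG16 outputs a feature map rather than class scores, which is why, in the listing above, the Flatten layer in top_model takes vgg16.output_shape[1:] as its input shape. For a 64 × 64 × 3 input, the five pooling stages of VGG16 reduce the spatial size to 2 × 2, which can be verified with a quick check like this (not part of the training script):

print(vgg16.output_shape)       # (None, 2, 2, 512) for a 64x64x3 input
print(vgg16.output_shape[1:])   # (2, 2, 512), what the Flatten layer receives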

top_model is the fully connected part that is attached after VGG16. VGG16 and top_model are combined into a single model called model. Since I want to keep the VGG16 weights exactly as they were learned on ImageNet, the following code freezes them.

# Freeze the weights of the first 19 layers (the VGG16 feature extraction part)
for layer in model.layers[:19]:
    layer.trainable = False
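To double-check which layers remain trainable after this loop, the layer list can be printed; with this model the first 19 entries are the input layer plus the VGG16 convolution and pooling layers, and only the attached top_model (the last entry) stays trainable. A quick inspection sketch, not part of the listing:

# Show each layer's index, name, and whether it will be updated during training
for i, layer in enumerate(model.layers):
    print(i, layer.name, layer.trainable)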

5. Use Google Colaboratory

Even when building a model with transfer learning, training can still take a long time if your PC has low specs, the dataset is large, or the number of epochs is high. In such cases, create the model on Google Colaboratory instead. Google Colaboratory is a Jupyter notebook environment provided by Google. Its biggest advantage is that processing can be accelerated with a GPU; in my experience the processing time can differ by roughly a factor of ten depending on whether a GPU is used. The steps required to run the source code above on Google Colaboratory are described below.

First open Google Colaboratory, give the notebook a name, and then change the settings to use the GPU. To enable the GPU, click "Edit -> Notebook settings" just below the file name in the upper left. On the screen that appears, select "GPU" as the hardware accelerator.

[Screenshot: notebook settings dialog with GPU selected]

The notebook can now use the GPU.
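To confirm that the runtime actually sees a GPU, a quick check like the following can be run in a cell. It uses TensorFlow, which Colaboratory provides by default, and is only a sanity check, not a required step.

import tensorflow as tf

# Prints a device string such as '/device:GPU:0' when a GPU is attached, or '' otherwise
print(tf.test.gpu_device_name())

Next, enter the following command to change the version of numpy.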

!pip install numpy==1.16.1

This command is needed to avoid an error when loading the dataset in the source code above: recent versions of numpy set allow_pickle=False by default in np.load, which makes loading a pickled dataset fail (an alternative, passing allow_pickle=True to np.load, is shown in the loading sketch further down). Next, upload the dataset so it can be used from the notebook. To do this, link Google Colaboratory with Google Drive and read the file you uploaded to Google Drive. Enter the following code to set up the link.

from google.colab import drive
drive.mount('/content/gdrive')

When you run this code, a field for entering a URL and an authorization code appears.

[Screenshot: Drive mount authorization prompt]

Click the link and log in to Google Drive, and an authorization code is displayed. Copy it into the field, and the data on Google Drive becomes available. On the notebook, the mounted Drive appears under ./gdrive/MyDrive/, and the files saved in your Google Drive can be found there.
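Once Drive is mounted, the dataset path in cnn.py only needs to point at the mounted folder. A minimal sketch, with a made-up file name; passing allow_pickle=True explicitly also lets the load work without the numpy downgrade described above:

import numpy as np

# Hypothetical file name; the mounted folder may appear as 'MyDrive' or 'My Drive' depending on the Colaboratory version
dataset_path = './gdrive/MyDrive/face_dataset.npy'
(X_train, y_train, X_test, y_test) = np.load(dataset_path, allow_pickle=True)
print(X_train.shape, X_test.shape)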

6. Conclusion

This time, I created a machine learning model by performing transfer learning with my own dataset. Machine learning libraries make it easy even for an amateur to write working code, but that same ease can weaken your understanding of what the code actually does, so I think it is important to look back over the code as I have done here. Next time, I will use the model created in this article to build a web application that distinguishes the faces of Japanese people and foreigners.

References

Keras Documentation - Applications
How to use Google Colaboratory
I tried using the free GPU environment of Google Colaboratory
What is transfer learning? How do you do the expected "transfer learning" in deep learning?
