I tried to summarize SVM.

・ About SVM (Support Vector Machine)
・ Finally, as a personal hobby, its application to computer shogi

These are the topics I have put together here.

1. SVM overview

SVM: Understanding Support Vector Machine algorithm from examples (along with code) http://www.analyticsvidhya.com/blog/2015/10/understaing-support-vector-machine-example-code/?utm_content=buffer4951f&utm_medium=social&utm_source=twitter.com&utm_campaign=buffer

Although it is an English article, this one is my personal recommendation. It explains the concept of SVM graphically, without resorting to mathematical formulas, so I think it is very approachable.

・ Classification is performed by drawing a boundary surface between the classes.
・ Since no mean or variance is used, inserting or changing data does not require recalculating everything from scratch.

You can get a feel for these points from the article.
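As a rough illustration of these two points, here is a minimal sketch (assuming scikit-learn and NumPy are available; the toy data and variable names are my own, not from the article above): the fitted classifier is just a separating boundary, and it is determined only by the support vectors, so a point added far from the boundary leaves it essentially unchanged.

```python
import numpy as np
from sklearn.svm import SVC

# Toy 2-D data: two linearly separable clusters.
rng = np.random.RandomState(0)
X = np.vstack([rng.randn(20, 2) - 2, rng.randn(20, 2) + 2])
y = np.array([0] * 20 + [1] * 20)

clf = SVC(kernel="linear").fit(X, y)

# Only a handful of points (the support vectors) define the boundary.
print("support vectors per class:", clf.n_support_)
print("boundary: w =", clf.coef_, " b =", clf.intercept_)

# Adding a point far from the boundary (and on the correct side)
# leaves the boundary essentially unchanged.
X2 = np.vstack([X, [[-6.0, -6.0]]])
y2 = np.append(y, 0)
clf2 = SVC(kernel="linear").fit(X2, y2)
print("boundary: w =", clf2.coef_, " b =", clf2.intercept_)
```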

2. About the method called SVM

Excellent articles that summarize PRML (Pattern Recognition and Machine Learning) already exist, so please refer to them.

1). Linear SVM

http://aidiary.hatenablog.com/entry/20100501/1272712699

The above article shows that, rather than arbitrarily defining how to draw the boundary surface, we set up an optimization problem that maximizes the margin (the shortest distance between the classification boundary and the training data), and that it suffices to solve the dual of this convex programming problem.
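For reference, that optimization problem can be written as follows (the standard hard-margin formulation; the notation $x_i$, $y_i \in \{-1, +1\}$, $w$, $b$, $\alpha_i$ is mine, not taken from the linked article). The margin is $1/\lVert w \rVert$, so maximizing it is equivalent to

```math
\min_{w,\,b}\ \frac{1}{2}\lVert w\rVert^2
\quad \text{s.t.} \quad y_i\,(w^\top x_i + b) \ge 1 \quad (i = 1,\dots,n)
```

and the corresponding dual problem, which is the one actually solved, is

```math
\max_{\alpha}\ \sum_{i=1}^{n}\alpha_i
- \frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i\,\alpha_j\,y_i\,y_j\,x_i^\top x_j
\quad \text{s.t.} \quad \alpha_i \ge 0,\quad \sum_{i=1}^{n}\alpha_i\,y_i = 0 .
```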

2). Non-linear SVM

http://aidiary.hatenablog.com/entry/20100502/1272804952

By mapping the data with a non-linear function (a kernel), more complex classification boundaries become possible. However, if the classes still overlap, classification remains difficult.
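As a minimal sketch of this (assuming scikit-learn; the half-moon toy data set and parameter values are my own choice, not from the linked article), an RBF-kernel SVM can separate data that a linear SVM cannot:

```python
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two interleaving half-moons: not linearly separable in the input space.
X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

linear = SVC(kernel="linear").fit(X_train, y_train)       # straight boundary
rbf = SVC(kernel="rbf", gamma=1.0).fit(X_train, y_train)  # curved boundary

print("linear SVM accuracy:", linear.score(X_test, y_test))
print("RBF SVM accuracy:   ", rbf.score(X_test, y_test))
```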

3). Soft margin SVM

http://aidiary.hatenablog.com/entry/20100503/1272889097

The techniques up to this point are called hard margin SVMs; they assume that the data can be completely separated in the input space x.

The soft margin SVM, on the other hand, is a method for handling the case where the classes overlap, as described above, by penalizing misclassification.

The more misclassifications there are, the larger the penalty added to the objective being minimized. As a result, the parameters are adjusted so that misclassification is kept as small as possible.

The larger the penalty, the more accurately the training data is classified, no matter how convoluted the classification boundary becomes ...
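In scikit-learn this penalty corresponds to the parameter C (a minimal sketch under that assumption; the synthetic data and the values of C below are my own choice): a very large C drives training error down even at the risk of overfitting, while a small C tolerates some misclassification in exchange for a wider margin.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Overlapping classes, so perfect separation is impossible.
X, y = make_classification(n_samples=400, n_features=2, n_informative=2,
                           n_redundant=0, n_clusters_per_class=1,
                           class_sep=0.8, flip_y=0.1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Larger C = heavier penalty on misclassified training points.
for C in (0.01, 1.0, 100.0):
    clf = SVC(kernel="rbf", C=C).fit(X_train, y_train)
    print(f"C={C:>6}: train acc={clf.score(X_train, y_train):.3f}, "
          f"test acc={clf.score(X_test, y_test):.3f}")
```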

3. Utilization of SVM in computer shogi

Checkmate prediction by SVM and its application Makoto Miwa, Department of Fundamental Informatics, Graduate School of Frontier Sciences, The University of Tokyo (at that time) http://repository.dl.itc.u-tokyo.ac.jp/dspace/bitstream/2261/187/1/K-00177.pdf

SVMs are used to predict checkmate in shogi.

In shogi it is very important to judge whether there is a checkmate, and the main point of this paper is that "by using checkmate prediction with SVM, the search time could be reduced to 62%."

Regarding computer shogi, it describes in detail
・ the application of machine learning to the evaluation function
・ the search algorithms used to explore positions
so I think it will be very educational.

In "2. About the method called SVM", I explained step by step. Practically, from the calculation cost and simplicity, it is approximately 1. I think the linear SVM of is enough.

Some studies apply non-linear SVMs, but personally I think this is counterproductive: the accuracy gained is very small relative to the computational cost.


Supplement: Utilization of machine learning in computer shogi and go http://www.slideshare.net/TakashiKato2/ss-57966067

These are slides I uploaded myself; please see them for how machine learning is used in computer shogi and Go.
