I tried the common story of predicting the Nikkei 225 using deep learning (backtest)

Change history

- 2016/08/14: Added the result of combining the signals of the three methods.
- 2016/08/14: Added win rates.

Overview

Last time (I tried the common story of using Deep Learning to predict the Nikkei 225), I used the previous day's returns of the Nikkei 225 constituent stocks to predict whether the Nikkei Stock Average would rise or fall the next day. This time is the backtest edition: I simulated what would have happened if the strategy had actually been traded using the predicted returns.

Disclaimer

As mentioned in the previous article, I take no responsibility for any losses incurred by actually trading with this method.

Operational premise

Strictly speaking, commissions, execution timing, turnover, and so on should all be taken into account. Here, however, I make the very unrealistic simplification that the forecast is made at the previous day's closing price, the trading decision is made instantly, and the Nikkei 225 can be bought or sold instantly at that same price. (Honestly, because anything more would be tedious to compute.)

Last result

The previous results were as follows. I tried three methods: Random Forest (RF), Multilayer Perceptron (MLP), and Convolutional Neural Network (CNN).

Kobito.B8RCPa.png

Kobito.8YIHGQ.png

Kobito.VOx3BE.png

In terms of AUC, the order is CNN > MLP > RF.
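The article reports the previous AUCs only as charts. As a reminder of what AUC measures for an up/down classifier, here is a minimal sketch with hypothetical labels and scores (the real values come from the previous article's test set); it computes the pairwise-ranking form of ROC AUC with plain numpy:

```python
import numpy as np

def auc(y_true, scores):
    """Fraction of (up-day, down-day) pairs in which the up-day
    receives the higher score; ties count half. For binary labels
    this is exactly the ROC AUC, so it is threshold-free."""
    pos = scores[y_true == 1][:, None]   # shape (n_pos, 1)
    neg = scores[y_true == 0][None, :]   # shape (1, n_neg)
    wins = (pos > neg).sum() + 0.5 * (pos == neg).sum()
    return float(wins / (pos.size * neg.size))

# Hypothetical labels (1 = next day up) and predicted up-scores.
y = np.array([1, 0, 1, 1, 0, 1, 0, 0])
s = np.array([0.9, 0.2, 0.8, 0.7, 0.3, 0.6, 0.4, 0.1])
print(auc(y, s))  # 1.0: every up-day is ranked above every down-day
```

A higher AUC means the model ranks rising days above falling days more reliably, regardless of where the buy/sell threshold is placed.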

Operational backtest

The backtest runs on daily data from 2008 to the most recent date. Each strategy starts at 1000 and is shown as an index (left axis), together with its difference against the Nikkei 225 index (right axis).
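The simulation described above can be sketched in a few lines. The toy returns and signal below are hypothetical stand-ins for the real daily series from 2008 onward; the mechanics (take the day's return only when the signal says "up", index both curves to 1000, track the difference) follow the chart description:

```python
import numpy as np

# Toy daily Nikkei 225 returns and a model signal
# (1 = be long that day, 0 = stay in cash).
nikkei_ret = np.array([0.01, -0.02, 0.015, 0.005])
signal     = np.array([1,     0,     1,     0    ])

# The strategy earns the day's return only on days the signal was 1.
strat_ret = nikkei_ret * signal

# Index both equity curves to a starting value of 1000 (left axis)
# and track the gap against the Nikkei itself (right axis).
strategy  = 1000 * np.cumprod(1 + strat_ret)
benchmark = 1000 * np.cumprod(1 + nikkei_ret)
diff = strategy - benchmark
```

Because the premise assumes instant execution at the previous close, no transaction costs or slippage appear anywhere in this sketch.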

RF simulation results

RF運用.png

MLP simulation results

MLP運用.png

CNN simulation results

CNN運用.png

Comparison of three methods

三手法.png

Results from synthetic signals of CNN, MLP, RF (Added 2016/08/14)

Since the CNN has the best model performance, let's combine RF and MLP around it. The condition is:

signal = (CNN signal) & (RF signal | MLP signal)

In other words, the final signal is "up" only when the CNN predicts a rise and at least one of the other two models also predicts a rise.
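With the signals encoded as 0/1 arrays, the conditional expression above maps directly onto numpy's bitwise operators. The signal values here are hypothetical:

```python
import numpy as np

# Hypothetical daily up/down signals (1 = model predicts a rise).
cnn = np.array([1, 1, 0, 1, 0])
rf  = np.array([1, 0, 0, 1, 1])
mlp = np.array([0, 1, 1, 1, 0])

# Final signal: CNN says up AND at least one of RF/MLP agrees.
final_signal = cnn & (rf | mlp)
print(final_signal)  # [1 1 0 1 0]
```

Requiring agreement like this trades a few winning days for fewer false positives, which is why the combined signal can end up with a better risk-return than any single model.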

合成.png

Various statistics (added on August 14, 2016)

                               RF      MLP     CNN     Synthetic  Nikkei
Annual return (%)              5.6%    8.1%    9.2%    11.7%      1.5%
Annual standard deviation (%)  20.1%   20.5%   19.9%   18.6%      27.2%
Risk-return                    0.277   0.395   0.461   0.628      0.054
Cumulative win rate            54%     55%     54%     56%        52%

In terms of risk-return, the ranking is Synthetic > CNN > MLP > RF. Even a simple long-and-hold of the Nikkei average wins 52% of the time, so this period leans slightly toward an uptrend. Still, every method beats that baseline, and 56% for the synthetic signal is a fairly good number.
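The article does not show how the table's statistics are computed. A minimal sketch under common conventions (252 trading days per year, sample standard deviation, win rate as the fraction of positive days; the article may use slightly different definitions) would be:

```python
import numpy as np

def stats(daily_ret):
    """Annualized statistics in the spirit of the table above,
    assuming 252 trading days per year."""
    ann_ret = daily_ret.mean() * 252
    ann_std = daily_ret.std(ddof=1) * np.sqrt(252)
    risk_return = ann_ret / ann_std          # return per unit of risk
    win_rate = (daily_ret > 0).mean()        # fraction of up days
    return ann_ret, ann_std, risk_return, win_rate

# Hypothetical daily strategy returns.
ann_ret, ann_std, rr, wr = stats(np.array([0.01, -0.005, 0.002, 0.0]))
```

Risk-return here is annual return divided by annual standard deviation (a Sharpe ratio with the risk-free rate taken as zero), which matches the ordering Synthetic > CNN > MLP > RF in the table.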

Summary
