Try function optimization with Optuna

Introduction

Optuna is an automatic hyperparameter optimization framework. It seems to be used mainly for hyperparameter tuning in machine learning. See the official homepage for details.

Preparation

First, let's install the library. You can install it with pip install optuna.
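
To confirm the installation succeeded, you can check the version from Python (a quick sanity check; the exact version shown will depend on your environment):

import optuna
print(optuna.__version__)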

Experiment

This time, let's solve the minimization problem for the function

x^2 + y^2 + z^2

Definition of objective function

First, define the objective function.

# Set the objective function (this time x^2 + y^2 + z^2)
def objective(trial):
    # Set the parameters to optimize
    param = {
        'x': trial.suggest_int('x', -100, 100),
        'y': trial.suggest_int('y', -100, 100),
        'z': trial.suggest_int('z', -100, 100)
    }
    # Return the evaluation value (Optuna minimizes it by default)
    return param['x'] ** 2 + param['y'] ** 2 + param['z'] ** 2
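
As an aside, if you want to search continuous values instead of integers, Optuna also provides trial.suggest_uniform (newer versions prefer trial.suggest_float). A minimal sketch of the same objective with real-valued parameters:

# Variant objective that searches real values instead of integers
def objective_float(trial):
    x = trial.suggest_uniform('x', -100, 100)  # trial.suggest_float in newer Optuna
    y = trial.suggest_uniform('y', -100, 100)
    z = trial.suggest_uniform('z', -100, 100)
    return x ** 2 + y ** 2 + z ** 2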

Optimization execution

First, create a study object and then run the optimization. You can set the number of trials with n_trials, an argument of optimize().

# Create a study object
study = optuna.create_study()
# Run the optimization
study.optimize(objective, n_trials=500)
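
Minimization is the default behavior, but you can state it explicitly, and seeding the sampler makes runs reproducible. A sketch of this optional form (it behaves the same as the defaults above):

# Explicit direction and a seeded sampler for reproducible runs (optional)
study = optuna.create_study(
    direction='minimize',
    sampler=optuna.samplers.TPESampler(seed=0)
)
study.optimize(objective, n_trials=500)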

When executed, output like the following is displayed. (Excerpt)

[I 2019-12-01 23:01:21,564] Finished trial#381 resulted in value: 121.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:21,705] Finished trial#382 resulted in value: 56.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:21,866] Finished trial#383 resulted in value: 88.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:22,012] Finished trial#384 resulted in value: 104.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:22,170] Finished trial#385 resulted in value: 426.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:22,361] Finished trial#386 resulted in value: 5249.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:22,523] Finished trial#387 resulted in value: 165.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:22,684] Finished trial#388 resulted in value: 84.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.

If you want to see the optimized parameters, add the following:

print(study.best_params)

If you want to check the optimized objective function value, add the following.

print(study.best_value)
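
Alternatively, study.best_trial bundles the best trial's number, parameters, and value in a single object:

# The best trial as a single object
best = study.best_trial
print(best.number, best.params, best.value)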

Also, if you want to inspect each trial, the information is available in study.trials. The following code displays each trial's number, parameters, and objective function value.

for i in study.trials:
    print(i.number, i.params, i.value)
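
If pandas is installed, the same information can also be obtained as a DataFrame via study.trials_dataframe(), which is convenient for larger studies (column names may vary slightly across Optuna versions):

# Trial history as a pandas DataFrame (requires pandas)
df = study.trials_dataframe()
print(df.head())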

Code

Here is the complete code used this time.

# -*- coding: utf-8 -*-
import optuna
import matplotlib.pyplot as plt

# Set the objective function (this time x^2 + y^2 + z^2)
def objective(trial):
    # Set the parameters to optimize
    param = {
        'x': trial.suggest_int('x', -100, 100),
        'y': trial.suggest_int('y', -100, 100),
        'z': trial.suggest_int('z', -100, 100)
    }
    # Return the evaluation value (Optuna minimizes it by default)
    return param['x'] ** 2 + param['y'] ** 2 + param['z'] ** 2

if __name__ == '__main__':
    # Create a study object
    study = optuna.create_study()
    # Run the optimization
    study.optimize(objective, n_trials=500)

    epochs = []    # Stores the trial numbers
    values = []    # Stores the best value so far
    best = 100000    # Initialized to a sufficiently large value
    # Track the running best value across trials
    for i in study.trials:
        if best > i.value:
            best = i.value
        epochs.append(i.number + 1)
        values.append(best)

    # Plot settings
    plt.plot(epochs, values, color="red")
    plt.title("optuna")
    plt.xlabel("trial")
    plt.ylabel("value")
    plt.show()

Result

The resulting figure from this experiment is shown below. Since best_value was 3.0, the true optimal solution (0.0 at x = y = z = 0) was not reached, but the search can be seen converging at an early stage.

[Figure: optuna2.png — best value vs. trial number]
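
Incidentally, recent Optuna versions ship a built-in visualization that produces essentially the same convergence plot (requires plotly):

# Built-in convergence plot; available in recent Optuna versions
fig = optuna.visualization.plot_optimization_history(study)
fig.show()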
