Optuna is an automatic hyperparameter optimization framework, used mainly for hyperparameter tuning in machine learning.
Official homepage
First, let's install the library. You can install it with pip install optuna.
This time, let's use it to solve a simple minimization problem: minimizing x^2 + y^2 + z^2.
To begin, define the objective function.
# Objective function (this time x^2 + y^2 + z^2)
def objective(trial):
    # Parameters to optimize
    param = {
        'x': trial.suggest_int('x', -100, 100),
        'y': trial.suggest_int('y', -100, 100),
        'z': trial.suggest_int('z', -100, 100)
    }
    # Return the evaluation value (minimized by default)
    return param['x'] ** 2 + param['y'] ** 2 + param['z'] ** 2
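As an aside, suggest_int samples integers; if you would rather search over continuous values, recent versions of Optuna provide trial.suggest_float (older versions used suggest_uniform). A minimal sketch of the same objective with continuous parameters:
# Continuous variant of the objective (illustrative sketch)
def objective_float(trial):
    x = trial.suggest_float('x', -100.0, 100.0)
    y = trial.suggest_float('y', -100.0, 100.0)
    z = trial.suggest_float('z', -100.0, 100.0)
    return x ** 2 + y ** 2 + z ** 2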
Next, create a study object and run the optimization. You can set the number of searches with n_trials, an argument of optimize().
# Create a study object
study = optuna.create_study()
# Run the optimization
study.optimize(objective, n_trials=500)
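Note that create_study() minimizes the objective by default. If you want to maximize instead, you can pass the direction argument, as in this one-line sketch:
# Create a study that maximizes the objective instead
study = optuna.create_study(direction='maximize')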
When you run it, output like the following is displayed (excerpt).
[I 2019-12-01 23:01:21,564] Finished trial#381 resulted in value: 121.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:21,705] Finished trial#382 resulted in value: 56.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:21,866] Finished trial#383 resulted in value: 88.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:22,012] Finished trial#384 resulted in value: 104.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:22,170] Finished trial#385 resulted in value: 426.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:22,361] Finished trial#386 resulted in value: 5249.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:22,523] Finished trial#387 resulted in value: 165.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
[I 2019-12-01 23:01:22,684] Finished trial#388 resulted in value: 84.0. Current best value is 4.0 with parameters: {'x': 0, 'y': 0, 'z': 2}.
If you want to see the optimized parameters, add the following:
print(study.best_params)
If you want to check the optimized objective function value, add the following:
print(study.best_value)
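Both pieces of information are also bundled in study.best_trial, which can be handy when you want the trial number as well. A small illustrative snippet:
# best_trial holds the best parameters and value together
print(study.best_trial.number)  # index of the best trial
print(study.best_trial.params)  # same as study.best_params
print(study.best_trial.value)   # same as study.best_value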
Also, if you want to inspect each trial, you can get the information from study.trials. The following code prints each trial's number, parameters, and objective function value.
for i in study.trials:
    print(i.number, i.params, i.value)
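If you have pandas installed, the same per-trial information is available as a DataFrame via study.trials_dataframe(); a brief sketch:
# Requires pandas; returns one row per trial
df = study.trials_dataframe()
print(df.head())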
Finally, here is the full code used this time.
# -*- coding: utf-8 -*-
import optuna
import matplotlib.pyplot as plt


# Objective function (this time x^2 + y^2 + z^2)
def objective(trial):
    # Parameters to optimize
    param = {
        'x': trial.suggest_int('x', -100, 100),
        'y': trial.suggest_int('y', -100, 100),
        'z': trial.suggest_int('z', -100, 100)
    }
    # Return the evaluation value (minimized by default)
    return param['x'] ** 2 + param['y'] ** 2 + param['z'] ** 2


if __name__ == '__main__':
    # Create a study object
    study = optuna.create_study()
    # Run the optimization
    study.optimize(objective, n_trials=500)

    epoches = []   # Stores the trial numbers
    values = []    # Stores the best value so far
    best = 100000  # Initialize with a sufficiently large value
    # Track the running best value across trials
    for i in study.trials:
        if best > i.value:
            best = i.value
        epoches.append(i.number + 1)
        values.append(best)

    # Plot settings
    plt.plot(epoches, values, color="red")
    plt.title("optuna")
    plt.xlabel("trial")
    plt.ylabel("value")
    plt.show()
The resulting figure is shown below. Since best_value was 3.0, the true optimal solution (0 at x = y = z = 0) was not reached, but the search was confirmed to converge at an early stage.
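As an aside, Optuna also ships with built-in visualization helpers, so a similar convergence plot can be drawn without the manual bookkeeping above. This uses optuna.visualization, which requires plotly to be installed:
# Built-in optimization history plot (requires plotly)
fig = optuna.visualization.plot_optimization_history(study)
fig.show()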