How to use Optuna, a hyperparameter optimization framework developed by Preferred Networks.
Official tutorial: https://optuna.readthedocs.io/en/stable/tutorial/first.html / API reference: https://optuna.readthedocs.io/en/stable/reference/index.html
First, define the objective function and its search space. Next, build a study instance and run the optimization with the `optimize` method.
import optuna

# Define the objective function
def objective(trial):
    # Describe the search space here
    return score  # the objective value, which can be either minimized or maximized
The search space is set with the trial.suggest_* methods. Values can be suggested as integers, categorical choices, floats, and so on, as follows.
With suggest_categorical(name, choices), a value is sampled from a list of categories; in the example below, the parameter 'kernel' is chosen from three candidates.
suggest_int(name, low, high, step=1, log=False) gives the value of an integer parameter.
suggest_uniform(name, low, high) gives a continuous value sampled linearly between low and high.
suggest_loguniform(name, low, high) gives a continuous value sampled on a logarithmic scale.
suggest_float(name, low, high, *, step=None, log=False) gives the value of a floating-point parameter.
With suggest_discrete_uniform(name, low, high, q), the value is sampled from the range [low, high], discretized with step q.
def objective(trial):
    # How to specify the search space
    # Categorical parameter
    kernel = trial.suggest_categorical('kernel', ['linear', 'poly', 'rbf'])
    # Int parameter
    num_layers = trial.suggest_int('num_layers', 1, 3)
    # Uniform parameter
    dropout_rate = trial.suggest_uniform('dropout_rate', 0.0, 1.0)
    # Loguniform parameter
    learning_rate = trial.suggest_loguniform('learning_rate', 1e-5, 1e-2)
    # Discrete-uniform parameter
    drop_path_rate = trial.suggest_discrete_uniform('drop_path_rate', 0.0, 1.0, 0.1)
# Automatic hyperparameter optimization
study = optuna.create_study()
# n_trials sets the number of search iterations
study.optimize(objective, n_trials=100)