https://github.com/fmfn/BayesianOptimization
I tried out this Bayesian optimization library. It took a little ingenuity to use, so I'm leaving a note here.
```python
optimizer.maximize(
    init_points=2,
    n_iter=3,
)
```
This is from the README. It seemed that `maximize` was the only optimization method provided. What I wanted to do was optimize a search ranking (I want the target to show up near the top of the ranking), so I simply negated the ranking and maximized that.
```python
from bayes_opt import BayesianOptimization

optimizer = BayesianOptimization(
    f=foo,
    pbounds=pbounds,
)
```
In other words, the return value of this `foo` is negated. There may be a way to handle this on the `maximize` or `optimizer` side, but I couldn't find one.
```python
def foo(x, y):
    return -x ** 2 - (y - 1) ** 2 + 1
```
This is the function to be optimized.
```python
pbounds = {'x': (2, 4), 'y': (-3, 3)}
```
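To make the whole flow concrete, here is a minimal end-to-end sketch. The `search_rank` function is a hypothetical stand-in for my real objective, where a lower rank is better:

```python
from bayes_opt import BayesianOptimization

def search_rank(x, y):
    # Hypothetical stand-in for the real objective: lower is better.
    return x ** 2 + (y - 1) ** 2

def foo(x, y):
    # Negate the ranking so that maximizing foo minimizes the rank.
    return -search_rank(x, y)

pbounds = {'x': (2, 4), 'y': (-3, 3)}

optimizer = BayesianOptimization(f=foo, pbounds=pbounds)
optimizer.maximize(init_points=2, n_iter=3)

# optimizer.max holds the best parameters found so far; negate the
# target again to recover the original (minimized) ranking value.
print(optimizer.max)
```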
The mechanism is that you give each argument a range this way, and the optimizer goes looking for the optimum within it. In my case, I also wanted to pass an extra argument, as in `foo(x, y, z)`, but feeding that to `maximize` as-is raised an error, presumably because the optimizer only passes the parameters listed in `pbounds`. As a workaround, I made `z` a global variable and forcibly passed it that way. I don't know whether that's the right approach.
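A minimal sketch of that workaround, assuming `z` is some fixed setting that should not be searched over:

```python
z = 10  # fixed value, deliberately kept out of the search space

def foo(x, y):
    # Reads z from the enclosing (global) scope instead of taking it
    # as a parameter, so pbounds only needs to describe x and y.
    return -x ** 2 - (y - 1) ** 2 + z
```

A closure or `functools.partial` would probably be a cleaner way to bind `z`, but the global variable is what I actually used.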
This has nothing to do with Bayesian optimization itself, but when deciding the optimal blend ratio of x and y, I first wrote the following.
```python
def foo(a):
    return a * x + y
```

with

```python
pbounds = {'a': (0, 100)}
```
When I tried this, it was no good: `a` only scales `x` and never trades off against `y`, so it doesn't actually express a blend ratio. So I rewrote it as
```python
def foo(a):
    return a * x + (100 - a) * y
```

with

```python
pbounds = {'a': (0, 100)}
```
This worked.
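As a runnable sketch of that second version (the component values `x = 3` and `y = 5` are made-up numbers purely for illustration):

```python
from bayes_opt import BayesianOptimization

x, y = 3, 5  # hypothetical per-unit values of the two components

def foo(a):
    # a and (100 - a) always sum to 100, so one search variable
    # is enough to encode the blend ratio of x and y.
    return a * x + (100 - a) * y

optimizer = BayesianOptimization(f=foo, pbounds={'a': (0, 100)})
optimizer.maximize(init_points=2, n_iter=3)
print(optimizer.max)  # with y > x, the best a should end up near 0
```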