Solve optimization problems in Python. If you're looking for a good example to start from, the [Nelder-Mead method](https://ja.wikipedia.org/wiki/%E3%83%8D%E3%83%AB%E3%83%80%E3%83%BC%E2%80%93%E3%83%9F%E3%83%BC%E3%83%89%E6%B3%95) as implemented in [MATLAB's fminsearch function](https://jp.mathworks.com/help/matlab/ref/fminsearch.html) is a good place to look.
MATLAB offers optimization functions in the paid Optimization Toolbox, but fminsearch is available even if only base MATLAB is installed.
```matlab
a = sqrt(2);
banana = @(x)100*(x(2)-x(1)^2)^2+(a-x(1))^2;
[x,fval] = fminsearch(banana, [-1.2, 1], optimset('TolX',1e-8));
```
This is exactly the example given on the fminsearch page: it finds the X that minimizes this expression using the Nelder-Mead method.
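For reference, this objective is Rosenbrock's "banana" function with a parameter a, so the problem being solved is

```math
\min_{x_1, x_2}\; 100\,(x_2 - x_1^2)^2 + (a - x_1)^2
```

Since both terms are non-negative, the minimum is 0 at (x<sub>1</sub>, x<sub>2</sub>) = (a, a<sup>2</sup>); with a = sqrt(2) that is approximately (1.4142, 2), which is worth keeping in mind when checking the numerical results below.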
Let's port the banana function example above to Python. To solve this optimization problem in Python, we rely on SciPy. Looking at the scipy.optimize documentation, you can see that a variety of optimization algorithms are implemented there.
A simple port would look like this:
```python
from scipy.optimize import fmin
import math

banana = lambda X, a: 100*(X[1] - X[0]**2)**2 + (a - X[0])**2
a = math.sqrt(2)
arg = (a, )
fmin(banana, [-1, 1.2], args=arg)
```
Starting from the initial values (x<sub>1</sub>, x<sub>2</sub>) = (-1, 1.2), the optimization minimizes the value of the above expression and yields the result (x<sub>1</sub>, x<sub>2</sub>) = (1.41420186, 1.99996718).
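If you want the optimized values as a variable rather than just console output, capture the return value. A minimal sketch (the values in the comment are the ones reported above):

```python
from scipy.optimize import fmin
import math

banana = lambda X, a: 100*(X[1] - X[0]**2)**2 + (a - X[0])**2
xopt = fmin(banana, [-1, 1.2], args=(math.sqrt(2),))
print(xopt)  # approximately [1.41420186, 1.99996718], as reported above
```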
The first point worth noting in this port is the definition of the function with lambda. That said, lambdas come up in every Python tutorial, so this may not be anything special in itself. What matters is that the function "takes a list X and a variable a as arguments": the variables adjusted by `fmin` are the two values passed in the list X, while a is passed in as a constant. Of course, it can also be defined in the ordinary way:
```python
def banana(X, a):
    return 100*(X[1] - X[0]**2)**2 + (a - X[0])**2
```
Now, the arguments of `fmin` are the objective function to be minimized, the initial values of the variables to be adjusted to minimize that function, and a tuple of additional arguments that are passed to the objective function but are not themselves optimized. In this case, among the arguments of the objective function `banana`, the constant a is what gets passed as a tuple to the `args` argument of `fmin`; hence `arg = (a,)`. When writing a one-element tuple, it must end with a trailing comma.
Various options can be set for the optimization. An optimizer essentially "tweaks the parameters until the function value converges", and there are several possible conditions for judging that it has "converged". These conditions are configurable; see the documentation for details.
Now, let's set the maximum number of function evaluations to 400, the maximum number of iterations to 400, the tolerance on the function value to 1e-4, and the tolerance on X to 1e-4 (the MATLAB fminsearch default values), and call `fmin` with these tolerances specified explicitly:
```python
fmin(banana, [-1, 1.2], args=arg, xtol=1e-4, ftol=1e-4, maxiter=400, maxfun=400)
```
By the way, an optimization as small as the banana function finishes in an instant, but sometimes you want to visualize the progress of the calculation. In that case, specify a callback function to retrieve the values at each iteration.
```python
count = 0

def cbf(Xi):
    # called by fmin once per iteration with the current parameter vector Xi
    global count
    count += 1
    print('%d, %f, %f, %f' % (count, Xi[0], Xi[1], banana(Xi, math.sqrt(2))))
```
I'm not sure this is the best way to do it, but here the iteration count is kept in a global variable and printed each time.
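If you would rather avoid the global variable, one alternative sketch (not what the original does, just one possibility) is a closure that keeps the counter in its own state, assuming `banana` is defined as above:

```python
import math

def make_cbf():
    count = 0
    def cbf(Xi):
        nonlocal count  # keep the counter inside the closure
        count += 1
        print('%d, %f, %f, %f' % (count, Xi[0], Xi[1], banana(Xi, math.sqrt(2))))
    return cbf

# pass the closure to fmin instead of the global-counter version:
# fmin(banana, [-1, 1.2], args=arg, callback=make_cbf())
```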
Instead of printing the result of each iteration as text, it can also be shown on a graph.
```python
from scipy.optimize import fmin
import math
import matplotlib.pyplot as plt

count = 0
plt.axis([0, 100, 0, 6.5])
plt.ion()  # interactive mode so the plot updates during optimization

def cbf(Xi):
    global count
    count += 1
    f = banana(Xi, math.sqrt(2))
    print('%d, %f, %f, %f' % (count, Xi[0], Xi[1], f))
    plt.scatter(count, f)

banana = lambda X, a: 100*(X[1] - X[0]**2)**2 + (a - X[0])**2
a = math.sqrt(2)
arg = (a, )
fmin(banana, [-1, 1.2], args=arg, callback=cbf)
```
Now you can watch the points being plotted on the graph in real time with `matplotlib` (depending on the backend, you may need `plt.pause` to force a redraw, as in the complete script at the end).
Now, `fmin` produced the result (x<sub>1</sub>, x<sub>2</sub>) = (1.41420186, 1.99996718) as described above, but people often ask for a little more detail about the process by which that result was obtained. Yes, my boss does too (the rest is omitted)…
The `fmin` options for that are the `retall` and `full_output` arguments; if you set them to `True`, you get a number of additional return values.
```python
[xopt, fopt, iter, funcalls, warnflag, allvecs] = fmin(banana, [-1, 1.2], args=arg, retall=True, full_output=True)
```
`xopt` is the optimized parameter vector, and `fopt` is the value of the minimized function at that point. `iter` and `funcalls` tell you how many iterations and how many function evaluations were performed. `warnflag` stores a flag for the condition under which the run was judged to have "converged" (or not). `allvecs` stores, for each iteration, the values of the variables being optimized (the values of x<sub>1</sub> and x<sub>2</sub> in the banana function example above).
So, if you need the history of the variable adjustments at each iteration, you can graph `allvecs` after running the optimization with `fmin`, without doing any processing in a callback function.
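For instance, a minimal sketch along those lines, assuming `banana`, `a`, and `allvecs` from the call above are still in scope:

```python
import matplotlib.pyplot as plt

# allvecs holds one (x1, x2) vector per iteration
f_history = [banana(x, a) for x in allvecs]
plt.plot(range(1, len(f_history) + 1), f_history, marker='o')
plt.xlabel('iteration')
plt.ylabel('banana(x1, x2)')
plt.show()
```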
Putting it all together, here is the MATLAB fminsearch example reproduced as a complete script in Python:
```python
from scipy.optimize import fmin
import math
import matplotlib.pyplot as plt


def cbf(Xi):
    # called by fmin once per iteration with the current parameter vector
    global count
    count += 1
    f = banana(Xi, math.sqrt(2))
    plt.scatter(count, f)
    plt.pause(0.05)  # give matplotlib a chance to redraw


def banana(X, a):
    return 100*(X[1] - X[0]**2)**2 + (a - X[0])**2


def main():
    a = math.sqrt(2)
    arg = (a, )
    [xopt, fopt, iter, funcalls, warnflag, allvecs] = fmin(
        banana,
        [-1, 1.2],
        args=arg,
        callback=cbf,
        xtol=1e-4,
        ftol=1e-4,
        maxiter=400,
        maxfun=400,
        disp=True,
        retall=True,
        full_output=True)
    # print the history of (x1, x2) for each iteration
    for item in allvecs:
        print('%f, %f' % (item[0], item[1]))


if __name__ == '__main__':
    count = 1
    plt.axis([0, 100, 0, 6.5])
    main()
```
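Running this script prints the convergence summary from `disp=True`, plots the function value at each iteration through the callback, and finally prints the (x<sub>1</sub>, x<sub>2</sub>) history stored in `allvecs`, so you can compare the callback-based and `allvecs`-based views of the same run.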