Through my work, I recently had the chance to come into contact with mathematical optimization. I found it interesting, so I would like to share the results of some experimenting I did on my own.
What I tried is a classic example of mathematical optimization, the "traveling salesman problem". It can be implemented with PuLP, a free Python library, so I set up the environment and gave it a go.
The problem I want to try is the following.
◆ 2.12 Traveling salesman problem https://www.msi.co.jp/nuopt/docs/v19/examples/html/02-12-00.html
I sketched the setup below: A, B, C, and D are points on a map.
Starting from A, visiting every point exactly once in a single stroke, and returning to A, which route minimizes the total distance? This is the optimization problem known as the "traveling salesman problem". It is NP-hard, so in general it is considered quite difficult.
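To get a feel for why brute force quickly becomes hopeless, note that with the starting point fixed, the remaining N-1 points can be visited in (N-1)! different orders. A quick check (my own aside, not from the referenced pages):

```python
import math

# Number of directed tours from a fixed starting point: (N - 1)!
for n in (4, 10, 20):
    print(n, math.factorial(n - 1))
```

Four points give only 6 tours, but 20 points already give about 1.2 x 10^17, which is why exact solvers rely on integer programming rather than enumeration.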
However, this is easy to try with PuLP, a free modeler available in the Python environment, so I referred to the following article.
◆ Mathematical optimization starting from the traveling salesman problem https://qiita.com/panchovie/items/6509fb54e3d53f4766aa
The great thing about it is that it covers everything from installing PuLP to sample code. Let's plug our problem settings into the sample code right away. The following code at least seems to run:
```python
import pulp as pp

# Define the optimization model
mip_model = pp.LpProblem("tsp_mip", pp.LpMinimize)

N = 4
BigM = 10000
c_ = [
    [0, 6, 5, 5],  # point A -> [point A, point B, point C, point D]
    [6, 0, 7, 4],  # point B -> [point A, point B, point C, point D]
    [5, 7, 0, 3],  # point C -> [point A, point B, point C, point D]
    [5, 4, 3, 0],  # point D -> [point A, point B, point C, point D]
]

# Variable definitions
x = [[pp.LpVariable("x(%s,%s)" % (i, j), cat="Binary") for j in range(N)]
     for i in range(N)]
u = [pp.LpVariable("u(%s)" % i, cat="Continuous", lowBound=1.0, upBound=N)
     for i in range(N)]

# Define and register the objective (Equation (1))
objective = pp.lpSum(c_[i][j] * x[i][j] for i in range(N) for j in range(N) if i != j)
mip_model += objective

# Constraint (2): leave every point exactly once
for i in range(N):
    mip_model += pp.lpSum(x[i][j] for j in range(N) if i != j) == 1

# Constraint (3): enter every point exactly once
for i in range(N):
    mip_model += pp.lpSum(x[j][i] for j in range(N) if i != j) == 1
for i in range(N):
    mip_model += x[i][i] == 0

# Constraint (4): MTZ constraints
for i in range(N):
    for j in range(N):
        if i != j:
            mip_model += u[i] + 1.0 - BigM * (1.0 - x[i][j]) <= u[j]

# Run the optimization
status = mip_model.solve()

# Inspect the results
print("Status: {}".format(pp.LpStatus[status]))
print("Optimal Value [a.u.]: {}".format(objective.value()))
for i in range(N):
    for j in range(N):
        if i != j:
            print("x[%d][%d]:%f" % (i, j, x[i][j].value()))
for i in range(len(u)):
    print("u[%d] %f" % (i, u[i].value()))
```
If you run it as-is ...
```
Status: Infeasible
Optimal Value [a.u.]: 18.0
x[0][1]:0.000000
x[0][2]:1.000000
x[0][3]:0.000000
x[1][0]:0.999800
x[1][2]:0.000000
x[1][3]:0.000200
x[2][0]:0.000200
x[2][1]:0.000000
x[2][3]:0.999800
x[3][0]:0.000000
x[3][1]:1.000000
x[3][2]:0.000000
```
Evidently the calculation did not go well. The constraints cannot all be satisfied, and after some investigation it turned out that the MTZ constraint is wrong as written. Writing out all four inequalities on u implied by the final x values gives the following:
```
u[0] < u[2]
u[1] < u[0]
u[2] < u[3]
u[3] < u[1]
```
Satisfying all of these simultaneously would require

```
u[0] < u[2] < u[3] < u[1] < u[0] < ...
```

that is, u would have to increase forever, yet u is bounded between 1 and 4, so this is a contradiction. (Any closed tour necessarily comes back to its start, so the cycle of inequalities can never be satisfied.) The fix is to drop the MTZ constraint on the edges that return to point A, leaving:
```
u[0] < u[2]
u[2] < u[3]
u[3] < u[1]
```
Reflecting this in the implementation, the MTZ constraint becomes:
```python
# Constraint (4): MTZ constraints (skip edges returning to point A, i.e. j = 0)
for i in range(N):
    for j in range(1, N):
        if i != j:
            mip_model += u[i] + 1.0 - BigM * (1.0 - x[i][j]) <= u[j]
```
The source code is summarized below.
sample_route.py
```python
import pulp as pp

# Define the optimization model
mip_model = pp.LpProblem("tsp_mip", pp.LpMinimize)

N = 4
BigM = 10000
c_ = [
    [0, 6, 5, 5],  # point A -> [point A, point B, point C, point D]
    [6, 0, 7, 4],  # point B -> [point A, point B, point C, point D]
    [5, 7, 0, 3],  # point C -> [point A, point B, point C, point D]
    [5, 4, 3, 0],  # point D -> [point A, point B, point C, point D]
]

# Variable definitions
x = [[pp.LpVariable("x(%s,%s)" % (i, j), cat="Binary") for j in range(N)]
     for i in range(N)]
u = [pp.LpVariable("u(%s)" % i, cat="Continuous", lowBound=1.0, upBound=N)
     for i in range(N)]

# Define and register the objective (Equation (1))
objective = pp.lpSum(c_[i][j] * x[i][j] for i in range(N) for j in range(N) if i != j)
mip_model += objective

# Constraint (2): leave every point exactly once
for i in range(N):
    mip_model += pp.lpSum(x[i][j] for j in range(N) if i != j) == 1

# Constraint (3): enter every point exactly once
for i in range(N):
    mip_model += pp.lpSum(x[j][i] for j in range(N) if i != j) == 1
for i in range(N):
    mip_model += x[i][i] == 0

# Constraint (4): MTZ constraints (skip edges returning to point A, i.e. j = 0)
for i in range(N):
    for j in range(1, N):
        if i != j:
            mip_model += u[i] + 1.0 - BigM * (1.0 - x[i][j]) <= u[j]

# Run the optimization
status = mip_model.solve()

# Inspect the results
print("Status: {}".format(pp.LpStatus[status]))
print("Optimal Value [a.u.]: {}".format(objective.value()))
for i in range(N):
    for j in range(N):
        if i != j:
            print("x[%d][%d]:%f" % (i, j, x[i][j].value()))
for i in range(len(u)):
    print("u[%d] %f" % (i, u[i].value()))
```
The result is as follows.
```
Status: Optimal
Optimal Value [a.u.]: 18.0
x[0][1]:0.000000
x[0][2]:1.000000
x[0][3]:0.000000
x[1][0]:1.000000
x[1][2]:0.000000
x[1][3]:0.000000
x[2][0]:0.000000
x[2][1]:0.000000
x[2][3]:1.000000
x[3][0]:0.000000
x[3][1]:1.000000
x[3][2]:0.000000
u[0] 1.000000
u[1] 4.000000
u[2] 2.000000
u[3] 3.000000
```
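Rather than reading the tour off the x values by eye, you can walk the successor matrix mechanically. This small helper is my own addition (not in the referenced article); the hard-coded `x_val` mirrors the solver output above:

```python
# Hypothetical helper: reconstruct the tour from the 0/1 decision matrix.
names = ["A", "B", "C", "D"]
x_val = [
    [0, 0, 1, 0],  # A -> C
    [1, 0, 0, 0],  # B -> A
    [0, 0, 0, 1],  # C -> D
    [0, 1, 0, 0],  # D -> B
]

route = [0]                        # start at point A
while True:
    current = route[-1]
    nxt = x_val[current].index(1)  # the single j with x[i][j] == 1
    if nxt == 0:                   # back at A: tour complete
        break
    route.append(nxt)

print(" -> ".join(names[i] for i in route + [0]))
# A -> C -> D -> B -> A
```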
It converged without issues. The output is not graphical, but reading off the x values gives
Point A → Point C → Point D → Point B → Point A
as the shortest tour, with a total distance of 18. The page where the problem was posed lists
Point A → Point B → Point D → Point C → Point A
which also has distance 18, so both routes are shortest tours and the answer checks out.
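As a sanity check of my own (not part of the article), N = 4 is small enough to enumerate all six tours from A exhaustively, which confirms that 18 is optimal and that both routes attain it:

```python
from itertools import permutations

# Distance matrix from the problem (points A, B, C, D)
c = [
    [0, 6, 5, 5],
    [6, 0, 7, 4],
    [5, 7, 0, 3],
    [5, 4, 3, 0],
]
names = ["A", "B", "C", "D"]

tours = []
for p in permutations([1, 2, 3]):   # all visiting orders of B, C, D
    order = (0,) + p + (0,)         # start and end at A
    length = sum(c[a][b] for a, b in zip(order, order[1:]))
    tours.append((length, order))

best = min(length for length, _ in tours)
print("shortest:", best)            # shortest: 18
for length, order in tours:
    if length == best:
        print(" -> ".join(names[i] for i in order))
```

This prints both optimal tours, A -> B -> D -> C -> A and A -> C -> D -> B -> A, matching the solver's result and the textbook's answer.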
Mathematical optimization is a fresh area I had not touched before. Since there is an environment where anyone can try it for free, I will take this opportunity to study it further.