I am studying the genetic-algorithm background needed to use platypus and deap, based on the feature article in the March 2019 issue of Nikkei Software.
A genetic algorithm is an algorithm that imitates biological evolution through operations such as selection, crossover, and mutation. Because it relies heavily on random numbers, it is also a randomized algorithm.
A problem is suitable for a genetic algorithm if it satisfies the following two conditions.
・The quality of a set of input parameters can be quantified (evaluated) as a number
・The goal is to find the input that maximizes or minimizes that number
Gene: a component that expresses a trait
Chromosome: a collection of multiple genes
Individual: the entity expressed by one chromosome (individual and chromosome are sometimes treated as the same thing)
"Fitness" is the evaluation value of how suitable an individual is for the environment.
Three operations are performed to increase this fitness.
Selection: keep individuals with high fitness (cull individuals with low fitness)
Crossover: produce a child that inherits genes from two parents
Mutation: change an individual's genes with a certain probability
Crossover is a hallmark of genetic algorithms.
When all of these operations are completed once, the generation advances by one.
The OneMax problem is the problem of finding the list of 1s and 0s, such as [1,0,0,0,1,0,1,1], whose sum is as large as possible.
Gene: 0 or 1
Chromosome: [1,0,0,0,1,0,1,1]
Fitness: the sum of the list (in this case, sum([1,0,0,0,1,0,1,1]) = 4)
The program is made up of the following elements:
・Parameter settings
・Functions (calculate fitness, sort the population by fitness, selection: keep highly fit individuals, crossover, mutation)
・Main processing (generate the initial population, then repeat selection, crossover, and mutation)
Five parameters are used: gene length, population size, number of generations, mutation probability, and selection rate.
import random
import copy

# Parameters
LIST_SIZE = 10        # Length of the 0/1 list (gene length)
POPULATION_SIZE = 10  # Number of individuals in the population
GENERATION = 25       # Number of generations
MUTATE = 0.1          # Mutation probability
SELECT_RATE = 0.5     # Selection rate
In this example, GENERATION = 25, so the run ends at the 25th generation. In general, a run can also be ended when the fitness stops improving (a sketch of such a stopping rule appears after the results below). If you increase LIST_SIZE, you may not reach the optimal solution unless you also increase POPULATION_SIZE and GENERATION. Tuning the mutation probability, selection rate, and so on is important, but here they are chosen rather loosely.
# Calculate fitness
def calc_fitness(individual):  # individual is a chromosome
    return sum(individual)     # Sum of the chromosome's list elements
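# Quick check of the example from the problem statement (illustrative only,
# not part of the original program): [1,0,0,0,1,0,1,1] has fitness 4.
assert calc_fitness([1, 0, 0, 0, 1, 0, 1, 1]) == 4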
# Sort the population in descending order of fitness
def sort_fitness(population):
    # Pair each individual with its fitness and store the pairs in a list
    fp = []
    for individual in population:
        fitness = calc_fitness(individual)  # Calculate this individual's fitness
        fp.append((fitness, individual))    # Store (fitness, chromosome) pairs
    fp.sort(reverse=True)                   # Sort by fitness (descending order)

    sorted_population = []                  # Population holding the sorted individuals
    for fitness, individual in fp:          # Take out each individual and store it
        sorted_population.append(individual)
    return sorted_population                # Individuals in descending order of fitness
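# An equivalent, more compact alternative (my own sketch, not the article's
# code): sorted() with calc_fitness as the key gives the same descending
# order by fitness (individuals with equal fitness may be ordered differently).
def sort_fitness_alt(population):
    return sorted(population, key=calc_fitness, reverse=True)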
# Selection
# Keep the individuals with high fitness
def selection(population):
    sorted_population = sort_fitness(population)  # Individuals sorted by fitness (descending)
    # Number of individuals to keep: with SELECT_RATE = 0.5, 10 x 0.5 = 5 individuals survive
    n = int(POPULATION_SIZE * SELECT_RATE)
    return sorted_population[0:n]  # Keep the first n individuals (indices 0 to n-1)
# Crossover
# First copy ind1 to create a new individual ind, then replace the randomly
# chosen range r1 to r2 with the corresponding genes from ind2.
def crossover(ind1, ind2):
    r1 = random.randint(0, LIST_SIZE - 1)   # Randomly choose the position r1
    r2 = random.randint(r1 + 1, LIST_SIZE)  # Randomly choose the position r2 after r1
    ind = copy.deepcopy(ind1)               # Make a copy of ind1
    ind[r1:r2] = ind2[r1:r2]                # Replace the range r1 to r2 with ind2's genes
    return ind
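# To make the slice replacement concrete (my own example with illustrative
# values r1 = 3 and r2 = 7): genes at positions 3 to 6 come from the second
# parent, everything else from the first.
_p1 = [0] * LIST_SIZE
_p2 = [1] * LIST_SIZE
_child = copy.deepcopy(_p1)
_child[3:7] = _p2[3:7]
assert _child == [0, 0, 0, 1, 1, 1, 1, 0, 0, 0]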
# Mutation: mutate genes according to the configured mutation probability
def mutation(ind1):
    ind2 = copy.deepcopy(ind1)             # Make ind2 by copying ind1
    for i in range(LIST_SIZE):
        if random.random() < MUTATE:       # Draw a random number; mutate if it is below the mutation rate
            ind2[i] = random.randint(0, 1) # Set the mutated gene to 0 or 1 at random
    return ind2
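# Note (my own check, not from the article): because of the deepcopy above,
# mutation() never modifies the individual passed in. On average, each gene
# is redrawn with probability MUTATE = 0.1, so roughly LIST_SIZE * MUTATE = 1
# gene per individual is redrawn (and a redraw keeps the same value half the
# time, since randint(0, 1) can return the original bit).
_before = [0] * LIST_SIZE
_ = mutation(_before)
assert _before == [0] * LIST_SIZE  # the original individual is unchanged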
# Generate the initial population
population = []  # Population
for i in range(POPULATION_SIZE):
    individual = []  # Chromosome
    for j in range(LIST_SIZE):
        # Append 0 or 1 at random
        individual.append(random.randint(0, 1))
    population.append(individual)
for generation in range(GENERATION):
    print(str(generation + 1) + " generation")

    # Selection
    population = selection(population)

    # Repeat crossover and mutation to grow the population back to its original size
    n = POPULATION_SIZE - len(population)
    for i in range(n):
        # Randomly pick two individuals from the population
        # and generate a crossed-over child
        r1 = random.randint(0, len(population) - 1)
        r2 = random.randint(0, len(population) - 1)
        # Crossover
        individual = crossover(population[r1], population[r2])
        # Mutation
        individual = mutation(individual)
        # Add the child to the population
        population.append(individual)

    for individual in population:
        print(individual)
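As an aside, the nested loop that builds the initial population can be written more compactly with a list comprehension. This is just an equivalent sketch of my own, not the article's code:

population = [[random.randint(0, 1) for _ in range(LIST_SIZE)]
              for _ in range(POPULATION_SIZE)]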
This is a program that can solve the OneMax problem.
1 generation
[1, 1, 1, 0, 1, 1, 0, 0, 1, 1]
[1, 0, 1, 1, 0, 0, 1, 1, 1, 0]
[1, 0, 1, 0, 1, 1, 0, 1, 1, 0]
[1, 0, 1, 1, 1, 0, 0, 0, 0, 1]
[0, 0, 1, 1, 0, 1, 1, 0, 1, 0]
[1, 1, 1, 0, 1, 1, 0, 0, 1, 0]
[0, 1, 1, 0, 1, 1, 0, 0, 1, 0]
[0, 1, 1, 0, 1, 1, 1, 0, 1, 0]
[1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
[1, 0, 1, 1, 1, 1, 0, 1, 1, 0]
In the 14th generation, the optimal solution appeared.
14 generation
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 0, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 0, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 0, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 0, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 0, 1, 1, 1]
[1, 1, 1, 1, 1, 0, 0, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 0, 0, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 0, 1, 1, 1]
By the 20th generation, the first six individuals are already the optimal solution.
20 generation
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 0, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 0]
[1, 1, 1, 0, 1, 1, 1, 1, 1, 1]
In the 25th generation, all individuals have become the optimal solution.
25 generation
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
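As mentioned earlier, instead of always running a fixed number of generations, a run can be stopped once the best fitness stops improving (or reaches the known optimum of LIST_SIZE). Below is a minimal sketch of such a stopping rule, reusing the parameters and functions defined above; the variable names and the PATIENCE value are my own choices, not from the article.

# Re-initialize a population for this sketch
population = [[random.randint(0, 1) for _ in range(LIST_SIZE)]
              for _ in range(POPULATION_SIZE)]

best_fitness = -1  # best fitness seen so far
stagnation = 0     # generations without improvement
PATIENCE = 5       # stop after 5 generations without improvement (arbitrary choice)

for generation in range(GENERATION):
    population = selection(population)
    # Refill the population with children created by crossover and mutation
    while len(population) < POPULATION_SIZE:
        r1 = random.randint(0, len(population) - 1)
        r2 = random.randint(0, len(population) - 1)
        population.append(mutation(crossover(population[r1], population[r2])))
    # Track the best fitness and stop when it stagnates or hits the optimum
    current_best = max(calc_fitness(ind) for ind in population)
    if current_best > best_fitness:
        best_fitness = current_best
        stagnation = 0
    else:
        stagnation += 1
    if best_fitness == LIST_SIZE or stagnation >= PATIENCE:
        break

print("Stopped at generation", generation + 1, "with best fitness", best_fitness)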
Here's how to solve the basic OneMax problem.
Next time, I will solve the traveling salesman problem.