By Adisorn O.

Genetic Algorithm: Pseudo Code

Updated: May 29, 2023

(The code was generated by ChatGPT; my commentary is marked with ##.)


# Define parameters for Genetic Algorithm

## Commentary: These are the control parameters that govern the GA solver.


POPULATION_SIZE = 100

GENERATIONS = 100

MUTATION_PROBABILITY = 0.1

CROSSOVER_POINTS = 2


# Define initial population

## Commentary: As with most meta-heuristic searches, we start by generating a random initial set of candidate solutions, with the size given by the POPULATION_SIZE control parameter.

population = generate_population(POPULATION_SIZE)
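The helpers in this pseudocode are left abstract. As a minimal sketch, assuming a fixed-length bit-string encoding (the encoding and the CHROMOSOME_LENGTH value are my illustrative assumptions, not part of the generated code), generate_population could look like this:

```python
import random

CHROMOSOME_LENGTH = 8  # assumed problem size, purely for illustration

def generate_population(size):
    """Create `size` random bit-string chromosomes."""
    return [[random.randint(0, 1) for _ in range(CHROMOSOME_LENGTH)]
            for _ in range(size)]

population = generate_population(100)
```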


# Evaluate fitness of initial population

## Commentary: We evaluate the fitness (the value of the objective function) of each individual (chromosome) in the population. Fitness is the attribute used by the selection process.


fitness_scores = evaluate_fitness(population)
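evaluate_fitness depends entirely on the problem being solved. As an illustration, take the classic OneMax toy problem (my choice of objective, not part of the original code), where fitness is simply the number of 1-bits in the chromosome:

```python
def evaluate_fitness(population):
    """OneMax toy objective: fitness = number of 1-bits per chromosome."""
    return [sum(chromosome) for chromosome in population]

scores = evaluate_fitness([[1, 0, 1, 1], [0, 0, 0, 1]])
```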


## Commentary: Loop for each generation

for generation in range(GENERATIONS):

    # Select parents for reproduction

    ## Commentary: GA improves the quality of the solutions by selecting pairs of parents from the best individuals in the population. Two selection methods are commonly used: roulette-wheel selection and tournament selection.

    parents = selection(population, fitness_scores)
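A possible roulette-wheel implementation of selection is sketched below (the zero-fitness fallback is my own defensive choice, and the function assumes non-negative fitness values):

```python
import random

def roulette_wheel_selection(population, fitness_scores, n_parents):
    """Pick n_parents individuals with probability proportional to fitness."""
    total = sum(fitness_scores)
    if total == 0:
        # Degenerate generation where every fitness is zero: pick uniformly.
        return [random.choice(population) for _ in range(n_parents)]
    return random.choices(population, weights=fitness_scores, k=n_parents)
```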


    # Create offspring through crossover

    ## Commentary: Each pair of selected parents is recombined to create two new children: segments of genes are interchanged (crossed over) at random or fixed locations. Crossover provides exploitation, building better solutions out of the existing ones.

    offspring = crossover(parents, CROSSOVER_POINTS)
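Two-point crossover on a single pair of parents could be sketched as follows (the function name and the bit-string assumption are mine; chromosomes must have at least three genes for two interior cut points to exist):

```python
import random

def two_point_crossover(parent_a, parent_b):
    """Exchange the gene segment between two random cut points."""
    size = len(parent_a)
    cut1, cut2 = sorted(random.sample(range(1, size), 2))
    child_a = parent_a[:cut1] + parent_b[cut1:cut2] + parent_a[cut2:]
    child_b = parent_b[:cut1] + parent_a[cut1:cut2] + parent_b[cut2:]
    return child_a, child_b
```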


    ## Commentary: Each child may be mutated by randomly changing one or more of its genes. Mutation happens with a much lower probability than crossover; its role is exploration, searching outside the existing set of solutions. Without mutation, the search can stall in a local optimum and never reach the global optimum.

    # Apply mutation to some offspring

    mutated_offspring = mutation(offspring, MUTATION_PROBABILITY)
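Bit-flip mutation, applied gene by gene with a small independent probability, is one common realization (a sketch under the same bit-string assumption; the per-chromosome helper `mutate` is my own naming):

```python
import random

def mutate(chromosome, probability):
    """Flip each bit independently with the given (small) probability."""
    return [1 - gene if random.random() < probability else gene
            for gene in chromosome]

def mutation(offspring, probability):
    """Apply bit-flip mutation to every child in the offspring list."""
    return [mutate(child, probability) for child in offspring]
```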


    # Evaluate fitness of new offspring

    offspring_fitness_scores = evaluate_fitness(mutated_offspring)



    ## Commentary: Good parents are retained into the next generation, a practice called elitism. Although not strictly necessary, it stabilizes the search and reduces the number of iterations needed.

    # Replace worst individuals in population with offspring

    population = replace_population(population, fitness_scores, mutated_offspring, offspring_fitness_scores)
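One way to realize this elitist replacement is to pool parents and offspring and keep only the fittest (a sketch; ties are broken by list order):

```python
def replace_population(population, fitness_scores, offspring, offspring_scores):
    """Keep the best len(population) individuals from parents and offspring."""
    combined = list(zip(population, fitness_scores)) + \
               list(zip(offspring, offspring_scores))
    combined.sort(key=lambda pair: pair[1], reverse=True)
    return [individual for individual, _ in combined[:len(population)]]
```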



    ## Commentary: The old population has now been replaced by the new, better population, so its fitness scores are recomputed.

    # Update fitness scores for new population

    fitness_scores = evaluate_fitness(population)


# Return best individual found

best_individual = get_best_individual(population, fitness_scores)

return best_individual
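To close, the whole loop can be assembled into one runnable toy example on OneMax. Every concrete choice here (chromosome length 16, tournament selection of size 2, a single elite, a fixed seed) is my own illustration, not part of the generated pseudocode:

```python
import random

random.seed(42)  # fixed seed so the toy run is reproducible

POPULATION_SIZE = 30
GENERATIONS = 40
MUTATION_PROBABILITY = 0.05
CHROMOSOME_LENGTH = 16  # assumed problem size for the OneMax demo

def fitness(chromosome):
    return sum(chromosome)  # OneMax: maximize the number of 1-bits

def select_parent(population, scores):
    # Tournament selection of size 2: the fitter of two random picks wins.
    i, j = random.sample(range(len(population)), 2)
    return population[i] if scores[i] >= scores[j] else population[j]

def crossover(parent_a, parent_b):
    # Two-point crossover producing a single child.
    cut1, cut2 = sorted(random.sample(range(1, CHROMOSOME_LENGTH), 2))
    return parent_a[:cut1] + parent_b[cut1:cut2] + parent_a[cut2:]

def mutate(chromosome):
    return [1 - g if random.random() < MUTATION_PROBABILITY else g
            for g in chromosome]

population = [[random.randint(0, 1) for _ in range(CHROMOSOME_LENGTH)]
              for _ in range(POPULATION_SIZE)]

for _ in range(GENERATIONS):
    scores = [fitness(c) for c in population]
    best = max(population, key=fitness)
    offspring = [best]  # elitism: the current best survives unchanged
    while len(offspring) < POPULATION_SIZE:
        pa = select_parent(population, scores)
        pb = select_parent(population, scores)
        offspring.append(mutate(crossover(pa, pb)))
    population = offspring

best_individual = max(population, key=fitness)
```

With these settings the search typically reaches or comes very close to the all-ones optimum within the 40 generations.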




