Beginner Explanation
Imagine you’re trying to find the best recipe for a cookie. You start with a few different recipes (these are like your initial solutions). You bake a batch from each recipe and taste them. The best cookies get to be the parents for the next batch. You might mix the ingredients of the two best cookies (that’s crossover) and tweak a few things at random (that’s mutation). You keep repeating this process, gradually improving your cookies until you find the ultimate recipe. This is how evolutionary optimization works: it uses nature-inspired methods to improve solutions over time.

Technical Explanation
Evolutionary optimization is a heuristic optimization technique inspired by biological evolution. It maintains a population of candidate solutions that evolves over generations. Each generation applies selection (keeping the fittest solutions), crossover (combining solutions), and mutation (randomly perturbing solutions). A simple genetic algorithm in Python is given in the Code Examples section below.

Academic Context
Evolutionary optimization is grounded in the principles of evolutionary biology and has its roots in the field of evolutionary computation. Key algorithm families include Genetic Algorithms (GA), Genetic Programming (GP), and Differential Evolution (DE). Goldberg’s seminal book ‘Genetic Algorithms in Search, Optimization, and Machine Learning’ (1989) laid the groundwork for understanding how these algorithms mimic natural selection. Mathematically, evolutionary algorithms can be analyzed through concepts such as fitness landscapes, genetic diversity, and convergence properties, with frameworks established in works like Melanie Mitchell’s ‘An Introduction to Genetic Algorithms’.

Code Examples
Example 1:

```python
import random

# Example fitness function: minimize x^2 (optimum at x = 0)
def fitness_function(x):
    return x ** 2

# Simple genetic algorithm with elitism
def genetic_algorithm(population_size, generations):
    # Random initial population of real numbers in [-10, 10]
    population = [random.uniform(-10, 10) for _ in range(population_size)]
    for _ in range(generations):
        population.sort(key=fitness_function)  # Sort by fitness, best first
        next_generation = population[:2]  # Elitism: keep the best two
        while len(next_generation) < population_size:
            parent1, parent2 = random.sample(population[:5], 2)  # Select parents from the top five
            child = (parent1 + parent2) / 2 + random.uniform(-1, 1)  # Crossover (average) plus mutation
            next_generation.append(child)
        population = next_generation
    # The final population is unsorted, so take the minimum explicitly
    return min(population, key=fitness_function)

best_solution = genetic_algorithm(10, 100)
print(best_solution)
```
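The selection step in the Technical Explanation can be implemented in ways other than sorting the whole population. A common alternative is tournament selection; the sketch below (the helper name and the choice of k are illustrative, not from the source) returns the fittest of k randomly chosen candidates:

```python
import random

def tournament_select(population, fitness, k=3):
    """Pick k random candidates and return the fittest (lowest fitness)."""
    contenders = random.sample(population, k)
    return min(contenders, key=fitness)

random.seed(0)  # for reproducibility
pop = [random.uniform(-10, 10) for _ in range(20)]
winner = tournament_select(pop, lambda x: x ** 2)
print(winner)
```

With k equal to the population size the tournament always returns the global best; smaller k applies weaker selection pressure and preserves more diversity.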
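Beyond the genetic algorithm above, Differential Evolution (one of the algorithm families named in the Academic Context) can be sketched in pure Python. This is a minimal DE/rand/1/bin variant; the control parameters F and CR, the bounds, and the sphere objective are illustrative choices, not from the source:

```python
import random

def sphere(v):
    return sum(x * x for x in v)  # Minimize the sum of squares

def differential_evolution(fitness, dim=2, pop_size=15, generations=100, F=0.8, CR=0.9):
    # Random initial population of real-valued vectors in [-10, 10]^dim
    pop = [[random.uniform(-10, 10) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals other than pop[i]
            a, b, c = random.sample([x for j, x in enumerate(pop) if j != i], 3)
            mutant = [a[d] + F * (b[d] - c[d]) for d in range(dim)]  # Differential mutation
            j_rand = random.randrange(dim)  # Guarantee at least one gene from the mutant
            trial = [mutant[d] if (random.random() < CR or d == j_rand) else pop[i][d]
                     for d in range(dim)]
            if fitness(trial) <= fitness(pop[i]):  # Greedy one-to-one replacement
                pop[i] = trial
    return min(pop, key=fitness)

best = differential_evolution(sphere)
print(best)
```

Unlike the genetic algorithm above, DE mutates by adding a scaled difference of two population members, so the effective step size shrinks automatically as the population converges.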
View Source: https://arxiv.org/abs/2511.15377v1