Evolution Strategies

Beginner Explanation

Imagine you are trying to find the best way to stack blocks to build the tallest tower, but you can’t see the tower from the side. Instead, you can only feel how tall it is when you finish stacking. Evolution Strategies work like a team of builders who try different ways to stack the blocks. Some builders try stacking them one way, while others try a different way. After each attempt, they share what they learned about which tower was tallest. Over time, they learn from each other and improve their stacking techniques until they find the tallest possible tower. This is how Evolution Strategies help to find the best solutions to tough problems, even when the path isn’t clear.

Technical Explanation

Evolution Strategies (ES) are optimization algorithms inspired by the principles of natural evolution. They work by iteratively improving a population of candidate solutions: each candidate is evaluated with a fitness function, which measures how well it solves the problem, and the best-performing candidates are selected to produce offspring through mutation and recombination. A common variant is the (μ, λ)-ES, in which μ parents generate λ offspring and the next generation's μ parents are selected from those offspring. A Python implementation is given in Example 1 under Code Examples below.
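The recombination step mentioned above can be sketched with intermediate recombination, where each offspring averages two distinct, randomly chosen parents before Gaussian mutation noise is added (a minimal illustration; the array shapes and noise scale are assumptions, not from the source):

```python
import numpy as np

rng = np.random.default_rng(42)
num_parents, num_offspring = 5, 8
parents = rng.standard_normal((num_parents, 2))  # five 2-D parent vectors

# Intermediate recombination: each offspring averages two distinct
# random parents, then adds Gaussian mutation noise.
offspring = np.empty((num_offspring, 2))
for k in range(num_offspring):
    i, j = rng.choice(num_parents, size=2, replace=False)
    offspring[k] = (parents[i] + parents[j]) / 2 + rng.standard_normal(2) * 0.5

print(offspring.shape)
```

Selection would then keep the fittest of these offspring as the next generation's parents.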

Academic Context

Evolution Strategies (ES) belong to the family of evolutionary algorithms and are a popular approach to black-box optimization, especially in high-dimensional spaces where gradient-based methods are unavailable or unreliable. Their theoretical foundation is rooted in natural selection and genetic evolution, with candidate solutions treated as individuals in a population. Key papers include 'Evolution Strategies as a Scalable Alternative to Reinforcement Learning' by Salimans et al. (2017), which applies ES to training neural network policies, and 'Natural Evolution Strategies' by Wierstra et al. (2008), which introduces the algorithmic framework and its theoretical analysis. The mathematical underpinnings draw on stochastic processes and population dynamics, focusing on the convergence properties and efficiency of different mutation strategies.
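The natural-evolution-strategies line of work cited above estimates a search gradient from fitness-weighted noise samples rather than evolving a discrete population. A minimal sketch of that estimator with an isotropic Gaussian, in the spirit of Salimans et al. (2017), follows; the objective, noise scale, step size, and sample count are illustrative choices, not values from the source:

```python
import numpy as np

def objective(x):
    return -np.sum(x**2)  # optimum at the origin

rng = np.random.default_rng(0)
theta = np.array([3.0, -2.0])     # current search point
sigma, alpha, n = 0.1, 0.05, 100  # noise scale, step size, samples

for _ in range(200):
    eps = rng.standard_normal((n, theta.size))
    fitness = np.array([objective(theta + sigma * e) for e in eps])
    # Monte-Carlo estimate of the search gradient, with the mean
    # fitness subtracted as a variance-reducing baseline.
    grad = ((fitness - fitness.mean())[:, None] * eps).mean(axis=0) / sigma
    theta = theta + alpha * grad

print(theta)  # should end up close to the origin
```

Because only fitness evaluations are needed, the per-sample computations parallelize trivially, which is the property the 2017 paper exploits at scale.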

Code Examples

Example 1:

import numpy as np

# Objective function: maximize the negative sum of squares
# (equivalently, minimize ||x||^2; the optimum is at the origin).
def objective_function(x):
    return -np.sum(x**2)

# (mu, lambda) Evolution Strategy on a 2-D problem.
def evolution_strategy(num_parents, num_offspring, num_generations):
    rng = np.random.default_rng(0)
    population = rng.standard_normal((num_parents, 2))  # initial parents
    for _ in range(num_generations):
        # Each offspring mutates a randomly chosen parent.
        parent_idx = rng.integers(num_parents, size=num_offspring)
        offspring = population[parent_idx] + rng.standard_normal((num_offspring, 2)) * 0.5
        # Comma selection: the next parents come from the offspring only.
        fitness = np.array([objective_function(ind) for ind in offspring])
        population = offspring[np.argsort(fitness)[-num_parents:]]
    fitness = np.array([objective_function(ind) for ind in population])
    return population[np.argmax(fitness)]  # best of the final parents

best_solution = evolution_strategy(10, 20, 50)
print('Best solution:', best_solution)
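As a complementary sketch (not taken from the source), the simplest member of the family is the (1+1)-ES, here with Rechenberg's 1/5 success rule for step-size adaptation: if more than a fifth of recent mutations improve fitness the step size grows, otherwise it shrinks. The window length and scaling factors below are illustrative choices:

```python
import numpy as np

def objective_function(x):
    return -np.sum(x**2)  # optimum at the origin

rng = np.random.default_rng(1)
x = rng.standard_normal(2)  # single parent
sigma = 0.5                 # mutation step size
successes = 0

for t in range(1, 501):
    child = x + sigma * rng.standard_normal(2)
    if objective_function(child) > objective_function(x):
        x = child           # plus selection: keep the better of the two
        successes += 1
    if t % 20 == 0:         # adapt sigma every 20 mutations
        rate = successes / 20
        sigma *= 1.5 if rate > 0.2 else 0.6
        successes = 0

print('Best point:', x)
```

Keeping the success rate near 1/5 makes sigma track the distance to the optimum, so the search neither stalls with tiny steps nor overshoots with huge ones.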

View Source: https://arxiv.org/abs/2511.16652v1