
Genetic Algorithms: Evolving Solutions Through Selection, Crossover, and Mutation

How Darwinian evolution can be turned into an optimization algorithm, and when genetic algorithms outperform gradient-based methods.

2025-09-18
Artificial Life · genetic-algorithms · evolutionary-computation

Terminology

| Term | Definition |
| --- | --- |
| Chromosome (Genotype) | A candidate solution encoded as a string of symbols (bits, integers, or real numbers) that the GA manipulates |
| Phenotype | The decoded, expressed form of a genotype that is evaluated by the fitness function |
| Fitness Function | A function $f: \text{Genotype} \to \mathbb{R}$ that scores how good a candidate solution is at solving the target problem |
| Selection | The process of choosing parent individuals from the population, biased toward higher fitness, to produce offspring |
| Crossover (Recombination) | A genetic operator that combines segments of two parent chromosomes to produce one or more offspring |
| Mutation | A genetic operator that randomly alters one or more genes in a chromosome, introducing new genetic material |
| Elitism | A strategy that copies the top $k$ individuals unchanged into the next generation, guaranteeing fitness never decreases |
| Tournament Selection | A selection method that picks $t$ random individuals and selects the fittest among them as a parent |
| Schema | A template describing a subset of chromosomes sharing certain gene values at specific positions, central to Holland's Schema Theorem |

What & Why

A genetic algorithm (GA) is an optimization technique inspired by biological evolution. Instead of computing gradients or following a mathematical formula, a GA maintains a population of candidate solutions and iteratively improves them through selection, crossover, and mutation.

The core loop mirrors Darwinian evolution:

  1. Evaluate each individual's fitness.
  2. Select the fittest individuals as parents.
  3. Combine parents via crossover to produce offspring.
  4. Randomly mutate some offspring.
  5. Replace the old population with the new one.
  6. Repeat until convergence or a budget is exhausted.

GAs shine in problems where the search space is large, discontinuous, or poorly understood. They do not require a differentiable objective function, making them applicable to combinatorial optimization (traveling salesman, scheduling), design problems (antenna shapes, circuit layouts), and game AI. They are also the foundation of evolutionary computation, a broader family that includes genetic programming, evolution strategies, and neuroevolution.

How It Works

The GA Pipeline

Initialize Population → Evaluate Fitness → Selection → Crossover + Mutation → Replace Population → (repeat)

Selection Methods

Roulette wheel (fitness-proportionate): Each individual's selection probability is proportional to its fitness. Simple but can lead to premature convergence if one individual dominates.

Tournament selection: Pick $t$ random individuals, select the best. The parameter $t$ controls selection pressure: larger tournaments favor fitter individuals more strongly.

Rank-based selection: Sort by fitness, assign selection probability based on rank rather than raw fitness. Avoids issues with fitness scaling.
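
The two most common schemes can be sketched in a few lines of Python (function names are my own, not from any particular library):

```python
import random

def tournament_select(population, fitness, t=3):
    """Pick t random indices (with replacement); return the fittest of them."""
    best = random.randrange(len(population))
    for _ in range(t - 1):
        challenger = random.randrange(len(population))
        if fitness[challenger] > fitness[best]:
            best = challenger
    return population[best]

def roulette_select(population, fitness):
    """Fitness-proportionate selection (assumes non-negative fitness)."""
    total = sum(fitness)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, f in zip(population, fitness):
        running += f
        if running >= pick:
            return individual
    return population[-1]  # guard against floating-point round-off
```

Note the difference in behavior: roulette selection is sensitive to the absolute scale of fitness values, while tournament selection depends only on their ranking.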

Crossover Operators

Single-point crossover: Choose a random cut point, swap the tails of two parents. For binary chromosomes of length $L$, there are $L - 1$ possible cut points.

Two-point crossover: Choose two cut points, swap the segment between them.

Uniform crossover: For each gene position, flip a coin to decide which parent contributes. Provides maximum mixing.
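
Two of these operators, sketched for chromosomes represented as lists of genes (illustrative helper names):

```python
import random

def single_point_crossover(p1, p2):
    """Cut at a random point in [1, L-1] and swap the tails."""
    point = random.randint(1, len(p1) - 1)
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def uniform_crossover(p1, p2):
    """For each gene position, a fair coin decides which parent contributes."""
    c1, c2 = [], []
    for g1, g2 in zip(p1, p2):
        if random.random() < 0.5:
            c1.append(g1)
            c2.append(g2)
        else:
            c1.append(g2)
            c2.append(g1)
    return c1, c2
```

Both operators conserve genetic material: every gene in the parents appears in exactly one of the two children.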

Mutation

For binary chromosomes, mutation flips each bit independently with probability $p_m$ (typically $1/L$, where $L$ is the chromosome length). For real-valued chromosomes, mutation adds Gaussian noise: $x_i' = x_i + \mathcal{N}(0, \sigma^2)$.
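
Both variants in Python (a minimal sketch; function names are my own):

```python
import random

def bit_flip_mutation(chrom, p_m=None):
    """Flip each bit independently with probability p_m (default 1/L)."""
    if p_m is None:
        p_m = 1.0 / len(chrom)
    return [1 - g if random.random() < p_m else g for g in chrom]

def gaussian_mutation(chrom, sigma=0.1):
    """Add N(0, sigma^2) noise to each real-valued gene."""
    return [x + random.gauss(0.0, sigma) for x in chrom]
```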

Complexity Analysis

For a population of size $P$, chromosome length $L$, running for $G$ generations:

| Operation | Time | Notes |
| --- | --- | --- |
| Fitness evaluation (per generation) | $O(P \cdot C_f)$ | $C_f$ = cost of one fitness evaluation |
| Tournament selection | $O(P \cdot t)$ | $t$ = tournament size, typically a small constant |
| Crossover (per generation) | $O(P \cdot L)$ | Copy/swap $L$ genes for $P/2$ pairs |
| Mutation (per generation) | $O(P \cdot L)$ | Check each gene for mutation |
| Total GA run | $O(G \cdot P \cdot (C_f + L))$ | Fitness evaluation usually dominates |

Holland's Schema Theorem provides a theoretical foundation: short, low-order, above-average schemata receive exponentially increasing representation in successive generations. The expected number of instances of a schema $H$ grows as:

$m(H, t+1) \geq m(H, t) \cdot \frac{f(H)}{\bar{f}} \cdot \left(1 - p_c \cdot \frac{\delta(H)}{L-1}\right) \cdot (1 - p_m)^{o(H)}$

where $m(H,t)$ is the count of schema $H$ at generation $t$, $f(H)/\bar{f}$ is the relative fitness, $\delta(H)$ is the defining length, $o(H)$ is the order, $p_c$ is crossover probability, and $p_m$ is mutation probability.
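
Plugging illustrative numbers into the bound shows the growth effect (all values below are made up for demonstration, not taken from any benchmark):

```python
def schema_bound(m, f_H, f_avg, p_c, delta, L, p_m, o):
    """Lower bound on the expected schema count in the next generation,
    per Holland's Schema Theorem."""
    return m * (f_H / f_avg) * (1 - p_c * delta / (L - 1)) * (1 - p_m) ** o

# A short (delta=2), low-order (o=3), above-average (f_H/f_avg = 1.2) schema
# on a length-20 chromosome, with typical operator rates:
bound = schema_bound(m=10, f_H=1.2, f_avg=1.0,
                     p_c=0.7, delta=2, L=20, p_m=0.01, o=3)
# bound > 10: the schema is expected to gain instances.
```

Because the disruption terms depend on $\delta(H)$ and $o(H)$, only schemata that are both short and low-order retain most of their fitness advantage after crossover and mutation.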

Implementation

ALGORITHM GeneticAlgorithm(popSize, chromLength, maxGen, pCross, pMut, tournSize)
INPUT: popSize: integer, chromLength: integer, maxGen: integer,
       pCross: float, pMut: float, tournSize: integer
OUTPUT: bestIndividual: chromosome with highest fitness found

BEGIN
  population <- ARRAY of popSize random binary strings of length chromLength
  bestEver <- NULL

  FOR gen FROM 1 TO maxGen DO
    // Evaluate fitness
    FOR i FROM 0 TO popSize - 1 DO
      fitness[i] <- EvaluateFitness(population[i])
    END FOR

    // Track best
    bestIdx <- INDEX of maximum in fitness
    IF bestEver = NULL OR fitness[bestIdx] > EvaluateFitness(bestEver) THEN
      bestEver <- COPY of population[bestIdx]
    END IF

    // Build next generation
    nextPop <- EMPTY ARRAY

    // Elitism: keep best individual
    APPEND COPY of population[bestIdx] TO nextPop

    WHILE LENGTH(nextPop) < popSize DO
      parent1 <- TournamentSelect(population, fitness, tournSize)
      parent2 <- TournamentSelect(population, fitness, tournSize)

      IF RANDOM() < pCross THEN
        (child1, child2) <- SinglePointCrossover(parent1, parent2)
      ELSE
        child1 <- COPY of parent1
        child2 <- COPY of parent2
      END IF

      child1 <- Mutate(child1, pMut)
      child2 <- Mutate(child2, pMut)

      APPEND child1 TO nextPop
      IF LENGTH(nextPop) < popSize THEN
        APPEND child2 TO nextPop
      END IF
    END WHILE

    population <- nextPop
  END FOR

  RETURN bestEver
END

ALGORITHM TournamentSelect(population, fitness, tournSize)
INPUT: population: array of chromosomes, fitness: array of floats, tournSize: integer
OUTPUT: selected chromosome

BEGIN
  bestIdx <- RANDOM integer in [0, LENGTH(population) - 1]
  FOR i FROM 2 TO tournSize DO
    candidate <- RANDOM integer in [0, LENGTH(population) - 1]
    IF fitness[candidate] > fitness[bestIdx] THEN
      bestIdx <- candidate
    END IF
  END FOR
  RETURN COPY of population[bestIdx]
END

ALGORITHM SinglePointCrossover(parent1, parent2)
INPUT: parent1, parent2: binary strings of length L
OUTPUT: child1, child2: binary strings of length L

BEGIN
  point <- RANDOM integer in [1, L - 1]
  child1 <- parent1[0..point-1] + parent2[point..L-1]
  child2 <- parent2[0..point-1] + parent1[point..L-1]
  RETURN (child1, child2)
END
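
As a cross-check of the pseudocode above, here is a compact Python version (a sketch, not a canonical implementation), applied to the OneMax toy problem, where fitness is simply the number of 1-bits:

```python
import random

def one_max(chrom):
    """Toy fitness function: count of 1-bits."""
    return sum(chrom)

def tournament(pop, fit, t):
    """Tournament selection: best of t random picks (with replacement)."""
    best = random.randrange(len(pop))
    for _ in range(t - 1):
        c = random.randrange(len(pop))
        if fit[c] > fit[best]:
            best = c
    return pop[best][:]

def genetic_algorithm(pop_size=50, chrom_len=30, max_gen=100,
                      p_cross=0.9, p_mut=None, tourn_size=3, fitness=one_max):
    if p_mut is None:
        p_mut = 1.0 / chrom_len          # the usual 1/L default
    pop = [[random.randint(0, 1) for _ in range(chrom_len)]
           for _ in range(pop_size)]
    best_ever, best_fit = None, float("-inf")
    for _ in range(max_gen):
        fit = [fitness(ind) for ind in pop]
        gen_best = max(range(pop_size), key=lambda i: fit[i])
        if fit[gen_best] > best_fit:
            best_ever, best_fit = pop[gen_best][:], fit[gen_best]
        next_pop = [pop[gen_best][:]]    # elitism: carry over the best
        while len(next_pop) < pop_size:
            p1 = tournament(pop, fit, tourn_size)
            p2 = tournament(pop, fit, tourn_size)
            if random.random() < p_cross:
                point = random.randint(1, chrom_len - 1)
                c1 = p1[:point] + p2[point:]   # single-point crossover
                c2 = p2[:point] + p1[point:]
            else:
                c1, c2 = p1, p2
            for child in (c1, c2):
                mutated = [1 - g if random.random() < p_mut else g
                           for g in child]
                if len(next_pop) < pop_size:
                    next_pop.append(mutated)
        pop = next_pop
    return best_ever, best_fit
```

On OneMax the elitist GA converges quickly; with modest settings it typically finds the all-ones optimum within a few dozen generations.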

Real-World Applications

  • Antenna design: NASA's ST5 mission used a GA to evolve antenna shapes that outperformed human-designed antennas, producing irregular but highly effective geometries
  • Scheduling and logistics: GAs solve job-shop scheduling, vehicle routing, and airline crew scheduling where the combinatorial search space is too large for exact methods
  • Circuit and chip design: Evolutionary approaches optimize FPGA configurations and analog circuit topologies
  • Game AI: GAs evolve strategies, level designs, and NPC behaviors in video games
  • Financial modeling: Portfolio optimization and trading strategy discovery use GAs to explore non-convex, multi-objective fitness landscapes
  • Drug discovery: Molecular structure optimization uses GA-like approaches to search chemical space for compounds with desired properties

Key Takeaways

  • Genetic algorithms optimize by mimicking natural selection: evaluate, select, recombine, mutate, repeat
  • They require no gradient information, making them applicable to discontinuous, noisy, or black-box fitness landscapes
  • Total cost is $O(G \cdot P \cdot (C_f + L))$, dominated by fitness evaluation in most practical problems
  • Tournament selection with elitism provides a good balance of exploration and exploitation
  • Holland's Schema Theorem explains why GAs work: short, fit building blocks propagate exponentially across generations