Historical Development of Evolutionary Computing (M1.1)

Evolutionary computation is an area of computer science that uses ideas from biological evolution to solve computational problems. 

In the 1950s and the 1960s, several computer scientists independently invented different evolutionary computation methods. 

In the 1960s, Rechenberg introduced evolution strategies, a method he used to optimize real-valued parameters for devices such as airfoils. 

A “population” consisted of two individuals, a parent and a child mutated from the parent—each encoding a set of real-valued parameters to be optimized. The fitter of the two was selected to be the parent for the next generation. Mutation consisted of incrementing or decrementing a real value according to a given distribution. The parameters of this distribution were themselves encoded as part of each individual and thus coevolved with the parameters to be optimized. This idea was further developed by Schwefel, and the theory and application of evolution strategies have remained an active area of research.
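The scheme just described corresponds to what is now called a (1+1) evolution strategy. The following Python sketch is a minimal, illustrative reconstruction; the function name, the log-normal step-size update, and the parameter values are assumptions made here for clarity, not Rechenberg's original formulation.

import math
import random

def one_plus_one_es(fitness, x0, sigma0=1.0, generations=200, tau=0.2):
    """Minimal (1+1) evolution strategy sketch: one parent, one mutated child.
    The mutation step size sigma is encoded with the individual and coevolves
    with the parameters being optimized (self-adaptation)."""
    parent, sigma = list(x0), sigma0
    parent_fit = fitness(parent)
    for _ in range(generations):
        # Mutate the strategy parameter first, then perturb each object parameter.
        child_sigma = sigma * math.exp(random.gauss(0.0, tau))
        child = [xi + random.gauss(0.0, child_sigma) for xi in parent]
        child_fit = fitness(child)
        # The fitter of parent and child becomes the parent of the next generation.
        if child_fit >= parent_fit:
            parent, sigma, parent_fit = child, child_sigma, child_fit
    return parent, parent_fit

# Example: maximize a simple concave function of two real-valued parameters.
best, best_fit = one_plus_one_es(lambda x: -(x[0] - 3.0) ** 2 - (x[1] + 1.0) ** 2,
                                 [0.0, 0.0])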

Fogel, Owens, and Walsh developed evolutionary programming, a technique in which candidate solutions to given tasks were represented as finite-state machines and evolved by randomly mutating their state-transition diagrams and selecting the fittest. As in evolution strategies, random mutation was the only source of variation.
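A toy reconstruction of this idea in Python is sketched below. The finite-state-machine encoding, the sequence-prediction fitness, and the population size are illustrative assumptions made here, not the original setup of Fogel, Owens, and Walsh.

import random

# A finite-state machine is a dict mapping (state, input_symbol) -> (next_state, output_symbol).
STATES, SYMBOLS = range(3), (0, 1)

def random_fsm():
    return {(s, a): (random.choice(STATES), random.choice(SYMBOLS))
            for s in STATES for a in SYMBOLS}

def mutate(fsm):
    """Randomly alter one entry of the state-transition table."""
    child = dict(fsm)
    key = random.choice(list(child))
    child[key] = (random.choice(STATES), random.choice(SYMBOLS))
    return child

def predict_fitness(fsm, sequence):
    """Score: how often the machine's output predicts the next input symbol."""
    state, correct = 0, 0
    for cur, nxt in zip(sequence, sequence[1:]):
        state, out = fsm[(state, cur)]
        correct += (out == nxt)
    return correct

# Evolve by mutating each parent and keeping the fitter machines.
sequence = [random.randint(0, 1) for _ in range(50)]
population = [random_fsm() for _ in range(20)]
for _ in range(100):
    offspring = [mutate(f) for f in population]
    population = sorted(population + offspring,
                        key=lambda f: predict_fitness(f, sequence))[-20:]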


The techniques called genetic algorithms (GAs) were invented by Holland in the 1960s. GAs are population-based algorithms in which both mutation and crossover are sources of random variation. GAs typically operate on strings of bits rather than real-valued parameters. The canonical simple GA is close to the algorithm Holland originally proposed; his proposal also included an “inversion” operator for reordering the bits on a chromosome.
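The sketch below shows a minimal simple GA in Python. The choices of fitness-proportionate selection, single-point crossover, bitwise mutation, and the one-max example are standard textbook conventions assumed here, not Holland's exact formulation (the inversion operator is omitted).

import random

def simple_ga(fitness, n_bits=20, pop_size=30, generations=100,
              p_cross=0.7, p_mut=0.01):
    """Minimal simple-GA sketch: bit-string individuals, fitness-proportionate
    selection, single-point crossover, and bitwise mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]
        new_pop = []
        while len(new_pop) < pop_size:
            # Fitness-proportionate (roulette-wheel) selection of two parents.
            p1, p2 = random.choices(pop, weights=fits, k=2)
            if random.random() < p_cross:
                point = random.randint(1, n_bits - 1)   # single-point crossover
                c1 = p1[:point] + p2[point:]
                c2 = p2[:point] + p1[point:]
            else:
                c1, c2 = p1[:], p2[:]
            for child in (c1, c2):
                # Bitwise mutation: flip each bit with small probability.
                new_pop.append([b ^ 1 if random.random() < p_mut else b
                                for b in child])
        pop = new_pop[:pop_size]
    return max(pop, key=fitness)

# Example: maximize the number of 1s in the string ("one-max");
# the small epsilon keeps all selection weights positive.
best = simple_ga(lambda bits: sum(bits) + 1e-9)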


In contrast with evolution strategies and evolutionary programming, Holland’s goal was not to design algorithms to solve specific problems, but rather to formally study the phenomenon of adaptation as it occurs in nature and to develop ways in which the mechanisms of natural adaptation might be imported into computer systems.


Several other researchers working in the 1950s and 1960s developed evolution-inspired algorithms for optimization and machine learning.


In the last several years there has been widespread interaction among researchers studying various evolutionary computation methods, and the boundaries between evolution strategies, evolutionary programming, genetic algorithms, and other evolutionary computation methods have broken down to some extent.



First Generation 

  • Evolutionary Programming (Fogel)

  • Genetic Algorithm (Holland)

  • Evolution Strategies (Rechenberg, Schwefel)


Second Generation

  • Genetic Evolution of Data Structures (Michalewicz)

  • Genetic Evolution of Programs (Koza)

  • Hybrid Genetic Search (Davis)

  • Tabu Search (Glover)


Third Generation

  • Artificial Immune Systems (Forrest)

  • Cultural Algorithms (Reynolds)

  • DNA Computing (Adleman)

  • Ant Colony Optimization (Dorigo)

  • Particle Swarm Optimization (Kennedy & Eberhart)

  • Memetic Algorithms

  • Estimation of Distribution Algorithms
