Psychology Dictionary of Arguments


Genetic algorithms: Genetic algorithms are a type of evolutionary algorithm inspired by the process of natural selection. They are used to solve optimization problems by generating and evolving a population of candidate solutions. A genetic algorithm first creates a population of randomly generated solutions; each solution is then evaluated to determine its fitness, a measure of how well it solves the problem. See also Computer Programming, Software, Algorithms, Artificial Intelligence.
Annotation: The above characterizations of concepts are neither definitions nor exhaustive presentations of the problems related to them. Instead, they are intended to give a short introduction to the contributions below. – Lexicon of Arguments.


Peter Norvig on Genetic Algorithms - Dictionary of Arguments

Norvig I 126
Genetic Algorithms/Norvig/Russell: A genetic algorithm (or GA) is a variant of stochastic beam search in which successor states are generated by combining two parent states rather than by modifying a single state. The analogy to natural selection is the same as in stochastic beam search, except that now we are dealing with sexual rather than asexual reproduction.
Norvig I 127
Like beam searches, GAs begin with a set of k randomly generated states, called the population. Each state, or individual, is represented as a string over a finite alphabet – most commonly, a string of 0s and 1s.
Norvig I 128
Like stochastic beam search, genetic algorithms combine an uphill tendency with random exploration and exchange of information among parallel search threads. The primary advantage, if any, of genetic algorithms comes from the crossover operation. Yet it can be shown mathematically that, if the positions of the genetic code are permuted initially in a random order, crossover conveys no advantage.
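The mechanism described above (a population of bit strings, fitness evaluation, fitness-proportionate selection, crossover of two parents, and small random mutations) can be illustrated with a minimal sketch. This code is not from Norvig and Russell's text; it is an illustrative toy implementation, using the standard "one-max" objective (maximize the number of 1s) as an assumed fitness function:

```python
import random

random.seed(0)  # for reproducibility of this sketch

def fitness(individual):
    # Toy "one-max" fitness: count the 1s in the bit string.
    return sum(individual)

def random_individual(n_bits):
    # Each state/individual is a string over a finite alphabet,
    # here the most common choice: 0s and 1s.
    return [random.randint(0, 1) for _ in range(n_bits)]

def select(population):
    # Fitness-proportionate ("roulette wheel") selection.
    weights = [fitness(ind) for ind in population]
    return random.choices(population, weights=weights, k=1)[0]

def crossover(parent_a, parent_b):
    # Single-point crossover: the successor combines two parent
    # states rather than modifying a single state.
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:]

def mutate(individual, rate=0.01):
    # Flip each bit with small probability (random exploration).
    return [1 - bit if random.random() < rate else bit
            for bit in individual]

def genetic_algorithm(n_bits=20, k=20, generations=50):
    # Begin with a set of k randomly generated states: the population.
    population = [random_individual(n_bits) for _ in range(k)]
    for _ in range(generations):
        population = [
            mutate(crossover(select(population), select(population)))
            for _ in range(k)
        ]
    return max(population, key=fitness)

best = genetic_algorithm()
```

Under a permutation of the bit positions, the one-max fitness here is unchanged, which matches the theoretical point above: for such objectives, crossover conveys no advantage over mutation-based search.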
Norvig I 155
In the 1950s, several statisticians, including Box (1957)(1) and Friedman (1959)(2), used evolutionary techniques for optimization problems, but it wasn’t until Rechenberg (1965)(3) introduced evolution strategies to solve optimization problems for airfoils that the approach gained popularity. In the 1960s and 1970s, John Holland (1975)(4) championed genetic algorithms, both as a useful tool and as a method to expand our understanding of adaptation, biological or otherwise (Holland, 1995)(5). The artificial life movement (Langton, 1995)(6) takes this idea one step further, viewing the products of genetic algorithms as organisms rather than solutions to problems.
VsGenetic algorithms: Most comparisons of genetic algorithms to other approaches (especially stochastic hill climbing) have found that the genetic algorithms are slower to converge (O’Reilly and Oppacher, 1994(7); Mitchell et al., 1996(8); Juels and Wattenberg, 1996(9); Baluja, 1997)(10).
VsVs: Such findings are not universally popular within the GA community, but recent attempts within that community to understand population-based search as an approximate form of Bayesian learning might help close the gap between the field and its critics (Pelikan et al., 1999)(11). >Genetic Programming/Norvig.

1. Box, G. E. P. (1957). Evolutionary operation: A method of increasing industrial productivity. Applied Statistics, 6, 81–101.
2. Friedman, G. J. (1959). Digital simulation of an evolutionary process. General Systems Yearbook, 4, 171–184.
3. Rechenberg, I. (1965). Cybernetic solution path of an experimental problem. Library translation 1122, Royal Aircraft Establishment.
4. Holland, J. H. (1975). Adaptation in Natural and Artificial Systems. University of Michigan Press.
5. Holland, J. H. (1995). Hidden Order: How Adaptation Builds Complexity. Addison-Wesley.
6. Langton, C. (Ed.). (1995). Artificial Life. MIT Press.
7. O’Reilly, U.-M. and Oppacher, F. (1994). Program search with a hierarchical variable length representation: Genetic programming, simulated annealing and hill climbing. In Proc. Third Conference on Parallel Problem Solving from Nature, pp. 397–406.
8. Mitchell, M., Holland, J. H., and Forrest, S. (1996). When will a genetic algorithm outperform hill climbing? In Cowan, J., Tesauro, G., and Alspector, J. (Eds.), NIPS 6. MIT Press.
9. Juels, A. and Wattenberg, M. (1996). Stochastic hill climbing as a baseline method for evaluating genetic algorithms. In Touretzky, D. S., Mozer, M. C., and Hasselmo, M. E. (Eds.), NIPS 8, pp. 430–436. MIT Press.
10. Baluja, S. (1997). Genetic algorithms and explicit search statistics. In Mozer, M. C., Jordan, M. I., and Petsche, T. (Eds.), NIPS 9, pp. 319–325. MIT Press.
11. Pelikan, M., Goldberg, D. E., and Cantu-Paz, E. (1999). BOA: The Bayesian optimization algorithm. In GECCO-99: Proc. Genetic and Evolutionary Computation Conference, pp. 525–532.

Explanation of symbols: Roman numerals indicate the source; Arabic numerals indicate the page number. The corresponding books are indicated on the right-hand side. ((s)…): Comment by the sender of the contribution. Translations: Dictionary of Arguments.
The notes [Concept/Author], [Author1]Vs[Author2], or [Author]Vs[term], as well as "problem:"/"solution:", "old:"/"new:", and "thesis:", are additions from the Dictionary of Arguments. If a German edition is specified, the page numbers refer to that edition.

Norvig I
Peter Norvig
Stuart J. Russell
Artificial Intelligence: A Modern Approach. Upper Saddle River, NJ 2010
