Metaheuristic

In computer science, metaheuristic designates a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. Metaheuristics make few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions. However, metaheuristics do not guarantee that an optimal solution is ever found. Many metaheuristics implement some form of stochastic optimization.

Other terms with a similar meaning to metaheuristic are: derivative-free, direct search, black-box, or indeed just heuristic optimizer. Several books and survey papers have been published on the subject.
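
The basic loop shared by many metaheuristics can be sketched in a few lines. Below is a minimal, hypothetical hill-climbing optimizer in Python; the helper names (objective, random_solution, perturb) are illustrative assumptions rather than any established library interface:

```python
import random

def metaheuristic_minimize(objective, random_solution, perturb, iterations=10_000):
    """Minimal sketch of the generic metaheuristic loop: keep a candidate
    solution and repeatedly try to improve it against a given measure of
    quality (here, a value to be minimized)."""
    best = random_solution()
    best_cost = objective(best)
    for _ in range(iterations):
        candidate = perturb(best)      # propose a nearby candidate
        cost = objective(candidate)
        if cost < best_cost:           # greedy acceptance rule
            best, best_cost = candidate, cost
    return best, best_cost

# Usage example: minimize a simple quadratic over the reals.
sol, cost = metaheuristic_minimize(
    objective=lambda x: (x - 3.0) ** 2,
    random_solution=lambda: random.uniform(-10, 10),
    perturb=lambda x: x + random.gauss(0, 0.5),
)
```

Note that this sketch never evaluates a gradient; it only compares objective values, which is what lets metaheuristics treat the problem as a black box.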

Applications

Metaheuristics are used for combinatorial optimization in which an optimal solution is sought over a discrete search-space. An example problem is the travelling salesman problem where the search-space of candidate solutions grows more than exponentially as the size of the problem increases, which makes an exhaustive search for the optimal solution infeasible. Additionally, multidimensional combinatorial problems, including most design problems in engineering such as form-finding and behavior-finding, suffer from the curse of dimensionality, which also makes them infeasible for exhaustive search or analytical methods. Popular metaheuristics for combinatorial problems include simulated annealing by Kirkpatrick et al., genetic algorithms by Holland et al., ant colony optimization by Dorigo, and scatter search and tabu search by Glover.
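
As a concrete illustration, the sketch below applies simulated annealing to a travelling salesman instance given as a distance matrix. It is a minimal Python version; the cooling schedule and parameter values (t0, alpha) are illustrative assumptions, not the canonical settings of Kirkpatrick et al.:

```python
import math
import random

def tour_length(tour, dist):
    """Length of a closed tour over a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def simulated_annealing_tsp(dist, steps=100_000, t0=10.0, alpha=0.9995):
    """Swap two cities at random; accept worse tours with a probability
    exp(-delta / t) that shrinks as the temperature t cools."""
    n = len(dist)
    tour = list(range(n))
    random.shuffle(tour)
    cost, t = tour_length(tour, dist), t0
    for _ in range(steps):
        i, j = random.sample(range(n), 2)
        tour[i], tour[j] = tour[j], tour[i]        # propose a swap
        new_cost = tour_length(tour, dist)
        if new_cost < cost or random.random() < math.exp((cost - new_cost) / t):
            cost = new_cost                        # accept (possibly worse) tour
        else:
            tour[i], tour[j] = tour[j], tour[i]    # undo the swap
        t *= alpha                                 # geometric cooling
    return tour, cost
```

Accepting occasional uphill moves is what distinguishes simulated annealing from plain local search: it lets the optimizer escape local optima of the discrete search-space.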

Metaheuristics are also used for problems over real-valued search-spaces, where the classic way of optimization is to derive the gradient of the function to be optimized and then employ gradient descent or a quasi-Newton method. Metaheuristics do not use the gradient or Hessian matrix, so their advantage is that the function to be optimized need not be continuous or differentiable and it can also have constraints. Popular metaheuristic optimizers for real-valued search-spaces include particle swarm optimization by Eberhart and Kennedy, differential evolution by Storn and Price, and evolution strategies by Rechenberg and Schwefel.
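
For real-valued search-spaces, a compact sketch of differential evolution (the classic DE/rand/1/bin scheme, in Python) illustrates the gradient-free principle. The control parameters f (differential weight) and cr (crossover rate) are common default choices assumed here, not values prescribed by Storn and Price:

```python
import random

def differential_evolution(objective, bounds, pop_size=30, f=0.8, cr=0.9,
                           generations=200):
    """Sketch of DE/rand/1/bin: mutate with a scaled difference of two
    population vectors, recombine, and keep the trial vector when it is
    no worse than its parent."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    costs = [objective(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = random.randrange(dim)   # guarantee one mutated component
            trial = [pop[a][d] + f * (pop[b][d] - pop[c][d])
                     if (d == j_rand or random.random() < cr) else pop[i][d]
                     for d in range(dim)]
            cost = objective(trial)
            if cost <= costs[i]:             # greedy one-to-one selection
                pop[i], costs[i] = trial, cost
    best = min(range(pop_size), key=costs.__getitem__)
    return pop[best], costs[best]

# Usage example: minimize the 2-D sphere function without any gradient.
print(differential_evolution(lambda x: sum(v * v for v in x),
                             bounds=[(-5.0, 5.0), (-5.0, 5.0)]))
```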

Metaheuristics based on decomposition techniques have also been proposed for tackling large-scale hard combinatorial problems.

Main contributions

Many different metaheuristics are in existence and new variants are continually being proposed. Some of the most significant contributions to the field are:
  • 1952: Robbins and Monro work on stochastic optimization methods.
  • 1952: Fermi and Metropolis develop an early form of pattern search as described belatedly by Davidon.
  • 1954: Barricelli carries out the first simulations of the evolution process and uses them on general optimization problems.
  • 1963: Rastrigin proposes random search.
  • 1965: Matyas proposes random optimization.
  • 1965: Rechenberg proposes evolution strategies.
  • 1965: Nelder and Mead propose a simplex heuristic, which was shown by Powell to converge to non-stationary points on some problems.
  • 1966: Fogel et al. propose evolutionary programming.
  • 1970: Hastings proposes the Metropolis-Hastings algorithm.
  • 1970: Cavicchio proposes adaptation of control parameters for an optimizer.
  • 1970: Kernighan and Lin propose a graph partitioning method, related to variable-depth search and prohibition-based (tabu) search.
  • 1975: Holland proposes the genetic algorithm.
  • 1977: Glover proposes scatter search.
  • 1978: Mercer and Sampson propose a metaplan for tuning an optimizer's parameters by using another optimizer.
  • 1980: Smith describes genetic programming.
  • 1983: Kirkpatrick et al. propose simulated annealing.
  • 1986: Glover proposes tabu search, first mention of the term metaheuristic.
  • 1986: Farmer et al. work on the artificial immune system.
  • 1986: Grefenstette proposes another early form of metaplan for tuning an optimizer's parameters by using another optimizer.
  • 1988: First conference on genetic algorithms is organized at the University of Illinois at Urbana-Champaign.
  • 1988: Koza registers his first patent on genetic programming.
  • 1989: Goldberg publishes a well known book on genetic algorithms.
  • 1989: Evolver, the first optimization software using the genetic algorithm.
  • 1989: Moscato proposes the memetic algorithm.
  • 1991: Interactive evolutionary computation.
  • 1992: Dorigo proposes the ant colony algorithm.
  • 1993: The journal Evolutionary Computation begins publication by the Massachusetts Institute of Technology.
  • 1993: Fonseca and Fleming propose MOGA for multiobjective optimization.
  • 1994: Battiti and Tecchiolli propose Reactive Search Optimization (RSO) principles for the online self-tuning of heuristics.
  • 1994: Srinivas and Deb propose NSGA for multiobjective optimization.
  • 1995: Kennedy and Eberhart propose particle swarm optimization.
  • 1995: Wolpert and Macready prove the no free lunch theorems.
  • 1996: Mühlenbein and Paaß work on the estimation of distribution algorithm.
  • 1996: Hansen and Ostermeier propose CMA-ES.
  • 1997: Storn and Price propose differential evolution.
  • 1997: Rubinstein proposes the cross entropy method.
  • 1999: Taillard and Voss propose POPMUSIC.
  • 2001: Geem et al. propose harmony search.
  • 2002: Deb et al. propose NSGA-II for multiobjective optimization.
  • 2004: Nakrani and Tovey propose bee colony optimization.
  • 2005: Krishnanand and Ghose propose glowworm swarm optimization.
  • 2005: Karaboga proposes the Artificial Bee Colony Algorithm (ABC).
  • 2006: Haddad et al. introduce honey-bee mating optimization.
  • 2007: Hamed Shah-Hosseini introduces Intelligent Water Drops.
  • 2008: Wierstra et al. propose natural evolution strategies based on the natural gradient.
  • 2008: Yang introduces the firefly algorithm.
  • 2008: Mucherino and Seref propose the Monkey Search.
  • 2009: Yang and Deb introduce cuckoo search.
  • 2011: Hamed Shah-Hosseini proposes the Galaxy-based Search Algorithm.
  • 2011: Tamura and Yasuda propose spiral optimization.

Criticism

Mathematical analyses of metaheuristics have been presented in the literature; see e.g. Holland's schema theorem for the genetic algorithm, the work by Trelea, amongst others, for analysis of particle swarm optimization, and Zaharie for analysis of differential evolution. These analyses make a number of assumptions in regard to the optimizer variants and the simplicity of the optimization problems, which limit their validity in real-world optimization scenarios. Performance and convergence aspects of metaheuristic optimizers are therefore often demonstrated empirically in the research literature. This has been criticized in the no free lunch set of theorems by Wolpert and Macready, which, among other things, prove that all optimizers perform equally well when averaged over all problems. The practical relevance of the no free lunch theorems, however, is minor, because they will generally not hold on the collection of problems a practitioner is facing.
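
The central claim can be stated compactly. In Wolpert and Macready's notation, where d_m^y is the sequence of m objective values an algorithm a has sampled on the function f, the first no free lunch theorem says that, summed over all possible objective functions, every algorithm yields the same performance distribution (a sketch of the standard formulation only, not of the full set of theorems):

```latex
% No free lunch (Wolpert & Macready, 1997):
% for any pair of algorithms a_1, a_2 and any number of samples m,
\sum_{f} P\left(d_m^y \mid f, m, a_1\right) = \sum_{f} P\left(d_m^y \mid f, m, a_2\right)
```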

For the practitioner, the most relevant issue is that metaheuristics are not guaranteed to find the optimum, or even a satisfactory near-optimal solution. All metaheuristics will eventually encounter problems on which they perform poorly, and the practitioner must therefore gain experience with which optimizers work well on different classes of problems.

See also

Metaheuristic methods are, generally speaking, sub-fields of:
  • Optimization algorithms
  • Heuristic
  • Stochastic search


Sub-fields of metaheuristics include:
  • Local search
  • Evolutionary computing, which includes, amongst others:
    • Evolutionary algorithms
    • Swarm intelligence

Other fields of interest:
  • Reactive Search Optimization
  • Meta-optimization
  • Hyper-heuristics
