Meta-optimization
In numerical optimization, meta-optimization is the use of one optimization method to tune another optimization method. Meta-optimization is reported to have been used as early as the late 1970s by Mercer and Sampson for finding optimal parameter settings of a genetic algorithm. Meta-optimization is also known in the literature as meta-evolution, super-optimization, automated parameter calibration, hyper-heuristics, etc.

Motivation

Optimization methods such as genetic algorithms and differential evolution have several parameters that govern their behaviour and efficacy in optimizing a given problem, and these parameters must be chosen by the practitioner to achieve satisfactory results. Selecting the behavioural parameters by hand is a laborious task that is susceptible to human misconceptions about what makes the optimizer perform well.

The behavioural parameters of an optimizer can be varied and the optimization performance plotted as a landscape. This is computationally feasible for optimizers with few behavioural parameters and optimization problems that are fast to compute, but when the number of behavioural parameters increases, the time needed to compute such a performance landscape grows exponentially. This is the curse of dimensionality for the search-space consisting of an optimizer's behavioural parameters. An efficient method is therefore needed to search the space of behavioural parameters.
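
For illustration, a brute-force performance landscape for an optimizer with two behavioural parameters might be computed as in the following sketch; the choice of differential evolution, the sphere benchmark, and all parameter ranges are illustrative assumptions, not taken from this article:

    # Minimal sketch (illustrative, not from the article): brute-force
    # "performance landscape" for a toy differential evolution (DE) with two
    # behavioural parameters, F (differential weight) and CR (crossover
    # probability), measured on the sphere problem.
    import numpy as np

    rng = np.random.default_rng(0)

    def sphere(x):
        return float(np.sum(x * x))

    def differential_evolution(F, CR, dim=10, pop_size=20, iters=100):
        """Run a basic DE/rand/1/bin and return the best fitness found."""
        pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))
        fit = np.array([sphere(p) for p in pop])
        for _ in range(iters):
            for i in range(pop_size):
                a, b, c = pop[rng.choice(pop_size, size=3, replace=False)]
                mutant = a + F * (b - c)
                cross = rng.random(dim) < CR
                cross[rng.integers(dim)] = True      # ensure at least one gene crosses
                trial = np.where(cross, mutant, pop[i])
                f_trial = sphere(trial)
                if f_trial <= fit[i]:                # greedy selection
                    pop[i], fit[i] = trial, f_trial
        return fit.min()

    # Grid-sweep the two behavioural parameters; the cost grows exponentially
    # with the number of swept parameters (the curse of dimensionality).
    F_values  = np.linspace(0.1, 1.0, 10)
    CR_values = np.linspace(0.0, 1.0, 11)
    landscape = np.empty((len(F_values), len(CR_values)))
    for i, F in enumerate(F_values):
        for j, CR in enumerate(CR_values):
            runs = [differential_evolution(F, CR) for _ in range(5)]  # average over repeats
            landscape[i, j] = np.mean(runs)

    best = np.unravel_index(np.argmin(landscape), landscape.shape)
    print("best grid point: F=%.2f CR=%.2f" % (F_values[best[0]], CR_values[best[1]]))

Even this small 10 x 11 grid with 5 repeated runs per point requires 550 runs of the inner optimizer; sweeping a third behavioural parameter would multiply the cost again, which is the exponential growth described above.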

Methods

A simple way of finding good behavioural parameters for an optimizer is to employ another overlaying optimizer, called the meta-optimizer. There are different ways of doing this depending on whether the behavioural parameters to be tuned are real-valued or discrete-valued, and depending on what performance measure is being used, etc.
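
As an illustration of the idea, the following sketch uses plain random search as the overlaying meta-optimizer to tune two real-valued behavioural parameters of a simple hill climber; the meta-fitness of a parameter setting is taken as the average end result of several runs of the inner optimizer. The hill climber, the Rastrigin benchmark, and the parameter ranges are assumptions made for this example, not part of the original article:

    # Minimal sketch (illustrative): random search used as a meta-optimizer to
    # tune the behavioural parameters of a simple hill climber. The meta-fitness
    # of a parameter setting is the average end result over several runs, so the
    # meta-optimizer works on a noisy objective.
    import numpy as np

    rng = np.random.default_rng(1)

    def rastrigin(x):
        return float(10 * x.size + np.sum(x * x - 10 * np.cos(2 * np.pi * x)))

    def hill_climber(step_size, decay, dim=10, iters=500):
        """Inner optimizer with two behavioural parameters: step_size and decay."""
        x = rng.uniform(-5.12, 5.12, size=dim)
        fx = rastrigin(x)
        for _ in range(iters):
            y = x + step_size * rng.standard_normal(dim)
            fy = rastrigin(y)
            if fy < fx:
                x, fx = y, fy
            step_size *= decay                  # gradually shrink the step size
        return fx

    def meta_fitness(params, repeats=10):
        """Average performance of the inner optimizer for one parameter setting."""
        step_size, decay = params
        return np.mean([hill_climber(step_size, decay) for _ in range(repeats)])

    # Meta-optimizer: plain random search over the behavioural-parameter space.
    best_params, best_score = None, np.inf
    for _ in range(50):                         # 50 meta-evaluations
        candidate = (rng.uniform(0.01, 2.0),    # step_size
                     rng.uniform(0.90, 1.0))    # decay
        score = meta_fitness(candidate)
        if score < best_score:
            best_params, best_score = candidate, score

    print("tuned parameters:", best_params, "meta-fitness:", best_score)

Any optimizer could be substituted for the random search as the meta-optimizer, and averaging over repeated runs (and, in practice, over several benchmark problems) is one way of defining the performance measure mentioned above.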

Meta-optimizing the parameters of a genetic algorithm was done by Grefenstette and Keane, amongst others, and experiments with meta-optimizing both the parameters and the genetic operators were reported by Bäck. Meta-optimization of particle swarm optimization was done by Meissner et al. as well as by Pedersen and Chipperfield, who also meta-optimized differential evolution. Birattari et al. meta-optimized ant colony optimization. Statistical models have also been used to reveal more about the relationship between choices of behavioural parameters and optimization performance; see for example Francois and Lavergne, and Nannen and Eiben. A comparison of various meta-optimization techniques was done by Smit and Eiben.