Genetic algorithm


[Figure: St 5-xband-antenna.jpg. The 2006 NASA ST5 spacecraft antenna. This complicated shape was found by an evolutionary computer design program to create the best radiation pattern. It is known as an evolved antenna.]

In computer science and operations research, a genetic algorithm (GA) is a metaheuristic inspired by the process of natural selection that belongs to the larger class of evolutionary algorithms (EA). Genetic algorithms are commonly used to generate high-quality solutions to optimization and search problems by relying on biologically inspired operators such as mutation, crossover and selection.



Methodology

Optimization problems

In a genetic algorithm, a population of candidate solutions (called individuals, creatures, or phenotypes) to an optimization problem is evolved toward better solutions. Each candidate solution has a set of properties (its chromosomes or genotype) which can be mutated and altered; traditionally, solutions are represented in binary as strings of 0s and 1s, but other encodings are also possible.



The evolution usually starts from a population of randomly generated individuals, and is an iterative process, with the population in each iteration called a generation. In each generation, the fitness of every individual in the population is evaluated; the fitness is usually the value of the objective function in the optimization problem being solved. The more fit individuals are stochastically selected from the current population, and each individual's genome is modified (recombined and possibly randomly mutated) to form a new generation. The new generation of candidate solutions is then used in the next iteration of the algorithm. Commonly, the algorithm terminates when either a maximum number of generations has been produced, or a satisfactory fitness level has been reached for the population.
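
Putting the pieces of this loop together, a minimal sketch in Python might look like the following. The function and parameter names, the tournament selection rule, and all numeric defaults are illustrative choices, not prescribed by the text:

```python
import random

def run_ga(fitness, chromosome_length=20, population_size=100,
           crossover_rate=0.9, mutation_rate=0.01, max_generations=200,
           target_fitness=None):
    """Minimal generational GA: evaluate, select, recombine, mutate, repeat."""
    # Initial generation: a population of random bit strings
    population = [[random.randint(0, 1) for _ in range(chromosome_length)]
                  for _ in range(population_size)]
    for _ in range(max_generations):
        scores = [fitness(ind) for ind in population]
        if target_fitness is not None and max(scores) >= target_fitness:
            break  # a satisfactory fitness level has been reached

        def tournament(k=3):
            # Stochastic, fitness-biased choice from the current population
            picks = random.sample(range(population_size), k)
            return population[max(picks, key=lambda i: scores[i])]

        next_population = []
        while len(next_population) < population_size:
            mother, father = tournament(), tournament()
            if random.random() < crossover_rate:
                point = random.randint(1, chromosome_length - 1)   # single-point crossover
                child = mother[:point] + father[point:]
            else:
                child = mother[:]
            child = [bit ^ 1 if random.random() < mutation_rate else bit
                     for bit in child]                             # bit-flip mutation
            next_population.append(child)
        population = next_population

    scores = [fitness(ind) for ind in population]
    best = max(range(population_size), key=lambda i: scores[i])
    return scores[best], population[best]

# Toy usage: maximize the number of 1s in the chromosome ("one-max")
best_score, best_bits = run_ga(fitness=sum)
```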



A typical genetic algorithm requires:



  1. a genetic representation of the solution domain,
  2. a fitness function to evaluate the solution domain.


A standard representation of each candidate solution is as an array of bits. Arrays of other types and structures can be used in essentially the same way. The main property that makes these genetic representations convenient is that their parts are easily aligned due to their fixed size, which facilitates simple crossover operations. Variable length representations may also be used, but crossover implementation is more complex in this case. Tree-like representations are explored in genetic programming and graph-form representations are explored in evolutionary programming; a mix of both linear chromosomes and trees is explored in gene expression programming.



Once the genetic representation and the fitness function are defined, a GA proceeds to initialize a population of solutions and then to improve it through repetitive application of the mutation, crossover, inversion and selection operators.



Initialization

The population size depends on the nature of the problem, but typically contains several hundreds or thousands of possible solutions. Often, the initial population is generated randomly, allowing the entire range of possible solutions (the search space). Occasionally, the solutions may be "seeded" in areas where optimal solutions are likely to be found.
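
As a sketch, random initialization with optional "seeding" might look like this; the `seeds` argument and all names are hypothetical illustrations:

```python
import random

def initialize_population(population_size, chromosome_length, seeds=None):
    """Random initial population of bit strings, optionally seeded with known-good candidates."""
    population = [list(seed) for seed in (seeds or [])]
    while len(population) < population_size:
        population.append([random.randint(0, 1) for _ in range(chromosome_length)])
    return population

# Purely random start, plus a variant that injects two hand-picked candidates
population = initialize_population(100, 16)
seeded = initialize_population(100, 16, seeds=[[1] * 16, [0, 1] * 8])
```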



Selection

During each successive generation, a portion of the existing population is selected to breed a new generation. Individual solutions are selected through a fitness-based process, where fitter solutions (as measured by a fitness function) are typically more likely to be selected. Certain selection methods rate the fitness of each solution and preferentially select the best solutions. Other methods rate only a random sample of the population, as the former process may be very time-consuming.
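
Both styles of selection mentioned above can be sketched in a few lines: roulette-wheel selection rates every individual (and assumes non-negative fitness values), while tournament selection rates only a small random sample. The names and the tournament size are illustrative:

```python
import random

def roulette_selection(population, scores):
    """Fitness-proportionate selection; rates every individual (assumes scores are non-negative)."""
    return random.choices(population, weights=scores, k=1)[0]

def tournament_selection(population, scores, k=3):
    """Rates only a random sample of k individuals and returns the fittest of them."""
    contenders = random.sample(range(len(population)), k)
    return population[max(contenders, key=lambda i: scores[i])]
```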



The fitness function is defined over the genetic representation and measures the quality of the represented solution. The fitness function is always problem dependent. For instance, in the knapsack problem one wants to maximize the total value of objects that can be put in a knapsack of some fixed capacity. A representation of a solution might be an array of bits, where each bit represents a different object, and the value of the bit (0 or 1) represents whether or not the object is in the knapsack. Not every such representation is valid, as the size of objects may exceed the capacity of the knapsack. The fitness of the solution is the sum of values of all objects in the knapsack if the representation is valid, or 0 otherwise.
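
The knapsack fitness function described here translates almost directly into code; the item weights, values, and capacity below are made-up example data:

```python
# Hypothetical items as (weight, value) pairs, and a fixed knapsack capacity
ITEMS = [(12, 4), (2, 2), (1, 2), (1, 1), (4, 10)]
CAPACITY = 15

def knapsack_fitness(bits):
    """Sum of the values of the selected items if they fit in the knapsack, 0 otherwise."""
    total_weight = sum(w for bit, (w, _) in zip(bits, ITEMS) if bit)
    total_value = sum(v for bit, (_, v) in zip(bits, ITEMS) if bit)
    return total_value if total_weight <= CAPACITY else 0

print(knapsack_fitness([0, 1, 1, 1, 1]))  # 15: total weight 8 fits within the capacity of 15
print(knapsack_fitness([1, 1, 1, 1, 1]))  # 0: total weight 20 exceeds the capacity
```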



In some problems, it is hard or even impossible to define the fitness expression; in these cases, a simulation may be used to determine the fitness function value of a phenotype (e.g. computational fluid dynamics is used to determine the air resistance of a vehicle whose shape is encoded as the phenotype), or even interactive genetic algorithms are used.



Genetic operators


The next step is to generate a second generation population of solutions from those selected through a combination of genetic operators: crossover (also called recombination), and mutation.



For each new solution to be produced, a pair of "parent" solutions is selected for breeding from the pool selected previously. By producing a "child" solution using the above methods of crossover and mutation, a new solution is created which typically shares many of the characteristics of its "parents". New parents are selected for each new child, and the process continues until a new population of solutions of appropriate size is generated.
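
A single-point crossover followed by bit-flip mutation, one common way to produce such a "child", might be sketched as follows; the mutation rate is an arbitrary example value:

```python
import random

def crossover(mother, father):
    """Single-point crossover: the child takes a prefix from one parent and a suffix from the other."""
    point = random.randint(1, len(mother) - 1)
    return mother[:point] + father[point:]

def mutate(child, rate=0.01):
    """Flip each bit independently with a small probability."""
    return [bit ^ 1 if random.random() < rate else bit for bit in child]

child = mutate(crossover([0] * 8, [1] * 8))
```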


Although reproduction methods that are based on the use of two parents are more "biology inspired", some research[1][2] suggests that more than two "parents" generate higher quality chromosomes.



These processes ultimately result in the next generation population of chromosomes that is different from the initial generation. Generally, the average fitness will have increased by this procedure for the population, since only the best organisms from the first generation are selected for breeding, along with a small proportion of less fit solutions. These less fit solutions ensure genetic diversity within the genetic pool of the parents and therefore ensure the genetic diversity of the subsequent generation of children.



Opinion is divided over the importance of crossover versus mutation. There are many references in Fogel (2006) that support the importance of mutation-based search.



Although crossover and mutation are known as the main genetic operators, it is possible to use other operators such as regrouping, colonization-extinction, or migration in genetic algorithms.[citation needed]



It is worth tuning parameters such as the mutation probability, crossover probability and population size to find reasonable settings for the problem class being worked on. A very small mutation rate may lead to genetic drift (which is non-ergodic in nature). A recombination rate that is too high may lead to premature convergence of the genetic algorithm. A mutation rate that is too high may lead to loss of good solutions, unless elitist selection is employed. An adequate population size ensures sufficient genetic diversity for the problem at hand, but can lead to a waste of computational resources if set to a value larger than required.



Heuristics

In addition to the main operators above, other heuristics may be employed to make the calculation faster or more robust. The speciation heuristic penalizes crossover between candidate solutions that are too similar; this encourages population diversity and helps prevent premature convergence to a less optimal solution.[3][4]
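
One simple way to realize such a speciation heuristic on bit strings is to discourage crossover between parents whose Hamming distance falls below some threshold; the sketch below is illustrative and the threshold value is arbitrary:

```python
def hamming_distance(a, b):
    """Number of positions at which two equal-length bit strings differ."""
    return sum(x != y for x, y in zip(a, b))

def allow_crossover(mother, father, min_distance=4):
    """Veto crossover between candidate solutions that are too similar to each other."""
    return hamming_distance(mother, father) >= min_distance
```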



Termination

This generational process is repeated until a termination condition has been reached. Common terminating conditions are:



  • A solution is found that satisfies minimum criteria
  • Fixed number of generations reached
  • Allocated budget (computation time/money) reached
  • The highest ranking solution's fitness is reaching or has reached a plateau such that successive iterations no longer produce better results
  • Manual inspection
  • Combinations of the above
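
A termination test typically combines several of these conditions. The following sketch checks a generation cap, an optional fitness target, and a simple plateau detector; all names and thresholds are illustrative:

```python
def should_terminate(generation, best_history, max_generations=500,
                     target_fitness=None, plateau_length=50):
    """Stop on a generation cap, on reaching a target fitness, or on a fitness plateau."""
    if generation >= max_generations:
        return True
    if target_fitness is not None and best_history and best_history[-1] >= target_fitness:
        return True
    if len(best_history) > plateau_length and \
            best_history[-1] <= best_history[-plateau_length - 1]:
        return True  # no improvement over the last plateau_length generations
    return False
```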


The building block hypothesis

Genetic algorithms are simple to implement, but their behavior is difficult to understand. In particular, it is difficult to understand why these algorithms frequently succeed at generating solutions of high fitness when applied to practical problems. The building block hypothesis (BBH) consists of:



  1. A description of a heuristic that performs adaptation by identifying and recombining "building blocks", i.e. low order, low defining-length schemata with above average fitness.
  2. A hypothesis that a genetic algorithm performs adaptation by implicitly and efficiently implementing this heuristic.


Goldberg describes the heuristic as follows:



"Short, low order, and highly fit schemata are sampled, recombined [crossed over], and resampled to form strings of potentially higher fitness. In a way, by working with these particular schemata [the building blocks], we have reduced the complexity of our problem; instead of building high-performance strings by trying every conceivable combination, we construct better and better strings from the best partial solutions of past samplings.

"Short, low order, and highly fit schemata are sampled, recombined [crossed over], and resampled to form strings of potentially higher fitness. In a way, by working with these particular schemata [the building blocks], we have reduced the complexity of our problem; instead of building high-performance strings by trying every conceivable combination, we construct better and better strings from the best partial solutions of past samplings.

“对短的、低阶的和高度适合的模式进行取样、重组[交叉] ,并重新取样以形成具有潜在更高适合度的字符串。在某种程度上,通过使用这些特定的模式[构建块] ,我们降低了问题的复杂性; 我们不是通过尝试每一种可以想象的组合来构建高性能字符串,而是从过去采样的最佳部分解决方案中构建越来越好的字符串。


"Because highly fit schemata of low defining length and low order play such an important role in the action of genetic algorithms, we have already given them a special name: building blocks. Just as a child creates magnificent fortresses through the arrangement of simple blocks of wood, so does a genetic algorithm seek near optimal performance through the juxtaposition of short, low-order, high-performance schemata, or building blocks."模板:Sfn

"Because highly fit schemata of low defining length and low order play such an important role in the action of genetic algorithms, we have already given them a special name: building blocks. Just as a child creates magnificent fortresses through the arrangement of simple blocks of wood, so does a genetic algorithm seek near optimal performance through the juxtaposition of short, low-order, high-performance schemata, or building blocks."

“由于低定义长度和低次序的高度拟合图式在遗传算法中起着如此重要的作用,我们已经给它们起了一个特殊的名字: 积木。就像一个孩子通过简单的木块排列来建造宏伟的堡垒一样,遗传算法也通过并列短小、低阶、高性能的图式或积木来寻求接近最优的性能。”


Despite the lack of consensus regarding the validity of the building-block hypothesis, it has been consistently evaluated and used as reference throughout the years. Many estimation of distribution algorithms, for example, have been proposed in an attempt to provide an environment in which the hypothesis would hold.[5][6] Although good results have been reported for some classes of problems, skepticism concerning the generality and/or practicality of the building-block hypothesis as an explanation for GAs efficiency still remains. Indeed, there is a reasonable amount of work that attempts to understand its limitations from the perspective of estimation of distribution algorithms.[7][8][9]



Limitations

There are limitations of the use of a genetic algorithm compared to alternative optimization algorithms:



  • Repeated fitness function evaluation for complex problems is often the most prohibitive and limiting segment of artificial evolutionary algorithms. Finding the optimal solution to complex high-dimensional, multimodal problems often requires very expensive fitness function evaluations. In real world problems such as structural optimization problems, a single function evaluation may require several hours to several days of complete simulation. Typical optimization methods cannot deal with such types of problem. In this case, it may be necessary to forgo an exact evaluation and use an approximated fitness that is computationally efficient. It is apparent that amalgamation of approximate models may be one of the most promising approaches to convincingly use GA to solve complex real life problems.
  • Genetic algorithms do not scale well with complexity. That is, where the number of elements which are exposed to mutation is large there is often an exponential increase in search space size. This makes it extremely difficult to use the technique on problems such as designing an engine, a house or a plane. In order to make such problems tractable to evolutionary search, they must be broken down into the simplest representation possible. Hence we typically see evolutionary algorithms encoding designs for fan blades instead of engines, building shapes instead of detailed construction plans, and airfoils instead of whole aircraft designs. The second problem of complexity is the issue of how to protect parts that have evolved to represent good solutions from further destructive mutation, particularly when their fitness assessment requires them to combine well with other parts.
  • The "better" solution is only in comparison to other solutions. As a result, the stop criterion is not clear in every problem.
  • In many problems, GAs have a tendency to converge towards local optima or even arbitrary points rather than the global optimum of the problem. This means that it does not "know how" to sacrifice short-term fitness to gain longer-term fitness. The likelihood of this occurring depends on the shape of the fitness landscape: certain problems may provide an easy ascent towards a global optimum, others may make it easier for the function to find the local optima. This problem may be alleviated by using a different fitness function, increasing the rate of mutation, or by using selection techniques that maintain a diverse population of solutions,[10] although the No Free Lunch theorem[11] proves that there is no general solution to this problem. A common technique to maintain diversity is to impose a "niche penalty", wherein, any group of individuals of sufficient similarity (niche radius) have a penalty added, which will reduce the representation of that group in subsequent generations, permitting other (less similar) individuals to be maintained in the population. This trick, however, may not be effective, depending on the landscape of the problem. Another possible technique would be to simply replace part of the population with randomly generated individuals, when most of the population is too similar to each other. Diversity is important in genetic algorithms (and genetic programming) because crossing over a homogeneous population does not yield new solutions. In evolution strategies and evolutionary programming, diversity is not essential because of a greater reliance on mutation.
  • Operating on dynamic data sets is difficult, as genomes begin to converge early on towards solutions which may no longer be valid for later data. Several methods have been proposed to remedy this by increasing genetic diversity somehow and preventing early convergence, either by increasing the probability of mutation when the solution quality drops (called triggered hypermutation), or by occasionally introducing entirely new, randomly generated elements into the gene pool (called random immigrants). Again, evolution strategies and evolutionary programming can be implemented with a so-called "comma strategy" in which parents are not maintained and new parents are selected only from offspring. This can be more effective on dynamic problems.
  • GAs cannot effectively solve problems in which the only fitness measure is a single right/wrong measure (like decision problems), as there is no way to converge on the solution (no hill to climb). In these cases, a random search may find a solution as quickly as a GA. However, if the situation allows the success/failure trial to be repeated giving (possibly) different results, then the ratio of successes to failures provides a suitable fitness measure.


Variants

Chromosome representation

The simplest algorithm represents each chromosome as a bit string. Typically, numeric parameters can be represented by integers, though it is possible to use floating point representations. The floating point representation is natural to evolution strategies and evolutionary programming. The notion of real-valued genetic algorithms has been offered but is really a misnomer because it does not really represent the building block theory that was proposed by John Henry Holland in the 1970s. This theory is not without support though, based on theoretical and experimental results (see below). The basic algorithm performs crossover and mutation at the bit level. Other variants treat the chromosome as a list of numbers which are indexes into an instruction table, nodes in a linked list, hashes, objects, or any other imaginable data structure. Crossover and mutation are performed so as to respect data element boundaries. For most data types, specific variation operators can be designed. Different chromosomal data types seem to work better or worse for different specific problem domains.



When bit-string representations of integers are used, Gray coding is often employed. In this way, small changes in the integer can be readily affected through mutations or crossovers. This has been found to help prevent premature convergence at so-called Hamming walls, in which too many simultaneous mutations (or crossover events) must occur in order to change the chromosome to a better solution.
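
Reflected Gray coding and its inverse can be computed with a couple of bit operations; the snippet below is a generic illustration, not tied to any particular GA library:

```python
def gray_encode(n):
    """Reflected binary Gray code of a non-negative integer."""
    return n ^ (n >> 1)

def gray_decode(g):
    """Inverse mapping: XOR together all right shifts of the code."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Consecutive integers differ in exactly one Gray-coded bit, so a single
# bit-flip mutation can step between neighbouring parameter values.
assert bin(gray_encode(7) ^ gray_encode(8)).count("1") == 1
assert gray_decode(gray_encode(1234)) == 1234
```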



Other approaches involve using arrays of real-valued numbers instead of bit strings to represent chromosomes. Results from the theory of schemata suggest that in general the smaller the alphabet, the better the performance, but it was initially surprising to researchers that good results were obtained from using real-valued chromosomes. This was explained as the set of real values in a finite population of chromosomes as forming a virtual alphabet (when selection and recombination are dominant) with a much lower cardinality than would be expected from a floating point representation.[12][13]



An expansion of the Genetic Algorithm accessible problem domain can be obtained through more complex encoding of the solution pools by concatenating several types of heterogenously encoded genes into one chromosome.[14] This particular approach allows for solving optimization problems that require vastly disparate definition domains for the problem parameters. For instance, in problems of cascaded controller tuning, the internal loop controller structure can belong to a conventional regulator of three parameters, whereas the external loop could implement a linguistic controller (such as a fuzzy system) which has an inherently different description. This particular form of encoding requires a specialized crossover mechanism that recombines the chromosome by section, and it is a useful tool for the modelling and simulation of complex adaptive systems, especially evolution processes.



Elitism

A practical variant of the general process of constructing a new population is to allow the best organism(s) from the current generation to carry over to the next, unaltered. This strategy is known as elitist selection and guarantees that the solution quality obtained by the GA will not decrease from one generation to the next.[15]
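
Elitism can be bolted onto the generational loop by copying the best individuals over unchanged before the rest of the new population is bred. In the sketch below, `breed(population, scores)` stands for whatever selection/crossover/mutation pipeline is in use, and the elite count is an arbitrary choice:

```python
def next_generation(population, scores, breed, elite_count=2):
    """Carry the best individuals over unchanged, then fill the rest of the generation by breeding."""
    ranked = sorted(zip(scores, population), key=lambda pair: pair[0], reverse=True)
    elites = [list(individual) for _, individual in ranked[:elite_count]]
    offspring = [breed(population, scores) for _ in range(len(population) - elite_count)]
    return elites + offspring
```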



Parallel implementations

Parallel implementations of genetic algorithms come in two flavors. Coarse-grained parallel genetic algorithms assume a population on each of the computer nodes and migration of individuals among the nodes. Fine-grained parallel genetic algorithms assume an individual on each processor node which acts with neighboring individuals for selection and reproduction.
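
A coarse-grained (island-model) parallel GA can be imitated even on a single machine by evolving several sub-populations and occasionally migrating copies of good individuals between them. The ring-migration sketch below is only schematic:

```python
import random

def migrate(islands, scores_per_island, migrants=2):
    """Ring migration: each island sends copies of its best individuals to the next island."""
    n = len(islands)
    for i, (island, scores) in enumerate(zip(islands, scores_per_island)):
        ranked = sorted(range(len(island)), key=lambda j: scores[j], reverse=True)
        best = [list(island[j]) for j in ranked[:migrants]]
        target = islands[(i + 1) % n]
        # Overwrite randomly chosen individuals on the neighbouring island with the immigrants
        for immigrant in best:
            target[random.randrange(len(target))] = immigrant
```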


Other variants, like genetic algorithms for online optimization problems, introduce time-dependence or noise in the fitness function.



Adaptive GAs

Genetic algorithms with adaptive parameters (adaptive genetic algorithms, AGAs) is another significant and promising variant of genetic algorithms. The probabilities of crossover (pc) and mutation (pm) greatly determine the degree of solution accuracy and the convergence speed that genetic algorithms can obtain. Instead of using fixed values of pc and pm, AGAs utilize the population information in each generation and adaptively adjust the pc and pm in order to maintain the population diversity as well as to sustain the convergence capacity. In AGA (adaptive genetic algorithm),[16] the adjustment of pc and pm depends on the fitness values of the solutions. In CAGA (clustering-based adaptive genetic algorithm),[17] through the use of clustering analysis to judge the optimization states of the population, the adjustment of pc and pm depends on these optimization states.
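
The flavor of such adaptation can be illustrated with a simple rule that keeps high crossover and mutation rates for below-average individuals and scales them down for individuals approaching the current best; this is a hypothetical sketch, not the published AGA or CAGA formulas:

```python
def adaptive_rates(individual_fitness, average_fitness, best_fitness,
                   pc_max=0.9, pm_max=0.1):
    """Keep exploration high for weak individuals, reduce disruption near the current best."""
    if best_fitness == average_fitness:          # population looks converged
        return pc_max, pm_max
    if individual_fitness <= average_fitness:    # below-average individual: explore aggressively
        return pc_max, pm_max
    # Above-average individual: scale the rates down as it approaches the best fitness
    scale = (best_fitness - individual_fitness) / (best_fitness - average_fitness)
    return pc_max * scale, pm_max * scale
```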


It can be quite effective to combine GA with other optimization methods. GA tends to be quite good at finding generally good global solutions, but quite inefficient at finding the last few mutations to find the absolute optimum. Other techniques (such as simple hill climbing) are quite efficient at finding absolute optimum in a limited region. Alternating GA and hill climbing can improve the efficiency of GA[citation needed] while overcoming the lack of robustness of hill climbing.



This means that the rules of genetic variation may have a different meaning in the natural case. For instance – provided that steps are stored in consecutive order – crossing over may sum a number of steps from maternal DNA adding a number of steps from paternal DNA and so on. This is like adding vectors that more probably may follow a ridge in the phenotypic landscape. Thus, the efficiency of the process may be increased by many orders of magnitude. Moreover, the inversion operator has the opportunity to place steps in consecutive order or any other suitable order in favour of survival or efficiency.[18]



A variation, where the population as a whole is evolved rather than its individual members, is known as gene pool recombination.



A number of variations have been developed to attempt to improve performance of GAs on problems with a high degree of fitness epistasis, i.e. where the fitness of a solution consists of interacting subsets of its variables. Such algorithms aim to learn (before exploiting) these beneficial phenotypic interactions. As such, they are aligned with the Building Block Hypothesis in adaptively reducing disruptive recombination. Prominent examples of this approach include the mGA,[19] GEMGA[20] and LLGA.[21]



Problem domains

Problems which appear to be particularly appropriate for solution by genetic algorithms include timetabling and scheduling problems, and many scheduling software packages are based on GAs[citation needed]. GAs have also been applied to engineering[22] and to reducing the length of psychological questionnaires.[23] Genetic algorithms are often applied as an approach to solve global optimization problems.



As a general rule of thumb genetic algorithms might be useful in problem domains that have a complex fitness landscape as mixing, i.e., mutation in combination with crossover, is designed to move the population away from local optima that a traditional hill climbing algorithm might get stuck in. Observe that commonly used crossover operators cannot change any uniform population. Mutation alone can provide ergodicity of the overall genetic algorithm process (seen as a Markov chain).



Examples of problems solved by genetic algorithms include: mirrors designed to funnel sunlight to a solar collector,[24] antennae designed to pick up radio signals in space,[25] walking methods for computer figures,[26] and the optimal design of aerodynamic bodies in complex flowfields.[27]



In his Algorithm Design Manual, Skiena advises against genetic algorithms for any task:



[I]t is quite unnatural to model applications in terms of genetic operators like mutation and crossover on bit strings. The pseudobiology adds another level of complexity between you and your problem. Second, genetic algorithms take a very long time on nontrivial problems. [...] [T]he analogy with evolution—where significant progress require [sic] millions of years—can be quite appropriate.


[...]


I have never encountered any problem where genetic algorithms seemed to me the right way to attack it. Further, I have never seen any computational results reported using genetic algorithms that have favorably impressed me. Stick to simulated annealing for your heuristic search voodoo needs.

Steven Skiena[28]:267



History

In 1950, Alan Turing proposed a "learning machine" which would parallel the principles of evolution.[29] Computer simulation of evolution started as early as in 1954 with the work of Nils Aall Barricelli, who was using the computer at the Institute for Advanced Study in Princeton, New Jersey.[30][31] His 1954 publication was not widely noticed. Starting in 1957,[32] the Australian quantitative geneticist Alex Fraser published a series of papers on simulation of artificial selection of organisms with multiple loci controlling a measurable trait. From these beginnings, computer simulation of evolution by biologists became more common in the early 1960s, and the methods were described in books by Fraser and Burnell (1970)[33] and Crosby (1973).[34] Fraser's simulations included all of the essential elements of modern genetic algorithms. In addition, Hans-Joachim Bremermann published a series of papers in the 1960s that also adopted a population of solution to optimization problems, undergoing recombination, mutation, and selection. Bremermann's research also included the elements of modern genetic algorithms.[35] Other noteworthy early pioneers include Richard Friedberg, George Friedman, and Michael Conrad. Many early papers are reprinted by Fogel (1998).[36]



Although Barricelli, in work he reported in 1963, had simulated the evolution of ability to play a simple game,[37] artificial evolution only became a widely recognized optimization method as a result of the work of Ingo Rechenberg and Hans-Paul Schwefel in the 1960s and early 1970s – Rechenberg's group was able to solve complex engineering problems through evolution strategies.[38][39][40][41] Another approach was the evolutionary programming technique of Lawrence J. Fogel, which was proposed for generating artificial intelligence. Evolutionary programming originally used finite state machines for predicting environments, and used variation and selection to optimize the predictive logics. Genetic algorithms in particular became popular through the work of John Holland in the early 1970s, and particularly his book Adaptation in Natural and Artificial Systems (1975). His work originated with studies of cellular automata, conducted by Holland and his students at the University of Michigan. Holland introduced a formalized framework for predicting the quality of the next generation, known as Holland's Schema Theorem. Research in GAs remained largely theoretical until the mid-1980s, when The First International Conference on Genetic Algorithms was held in Pittsburgh, Pennsylvania.



Commercial products

In the late 1980s, General Electric started selling the world's first genetic algorithm product, a mainframe-based toolkit designed for industrial processes.[42]


In 1989, Axcelis, Inc. released Evolver, the world's first commercial GA product for desktop computers. The New York Times technology writer John Markoff wrote[43] about Evolver in 1990, and it remained the only interactive commercial genetic algorithm until 1995.[44] Evolver was sold to Palisade in 1997, translated into several languages, and is currently in its 6th version.[45] Since the 1990s, MATLAB has built in three derivative-free optimization heuristic algorithms (simulated annealing, particle swarm optimization, genetic algorithm) and two direct search algorithms (simplex search, pattern search).[46]



Related techniques


Parent fields

Genetic algorithms are a sub-field of evolutionary algorithms, which in turn belong to the broader fields of evolutionary computing and metaheuristics.



Related fields

Evolutionary algorithms


Evolutionary algorithms are a sub-field of evolutionary computing.



  • Evolution strategies (ES, see Rechenberg, 1994) evolve individuals by means of mutation and intermediate or discrete recombination. ES algorithms are designed particularly to solve problems in the real-value domain.[47] They use self-adaptation to adjust control parameters of the search. De-randomization of self-adaptation has led to the contemporary Covariance Matrix Adaptation Evolution Strategy (CMA-ES).
  • Evolutionary programming (EP) involves populations of solutions with primarily mutation and selection and arbitrary representations. They use self-adaptation to adjust parameters, and can include other variation operations such as combining information from multiple parents.
  • Estimation of Distribution Algorithm (EDA) substitutes traditional reproduction operators by model-guided operators. Such models are learned from the population by employing machine learning techniques and represented as Probabilistic Graphical Models, from which new solutions can be sampled[48][49] or generated from guided-crossover.[50]
  • Gene expression programming (GEP) also uses populations of computer programs. These complex computer programs are encoded in simpler linear chromosomes of fixed length, which are afterwards expressed as expression trees. Expression trees or computer programs evolve because the chromosomes undergo mutation and recombination in a manner similar to the canonical GA. But thanks to the special organization of GEP chromosomes, these genetic modifications always result in valid computer programs.[51]
  • Genetic programming (GP) is a related technique popularized by John Koza in which computer programs, rather than function parameters, are optimized. Genetic programming often uses tree-based internal data structures to represent the computer programs for adaptation instead of the list structures typical of genetic algorithms.
  • Grouping genetic algorithm (GGA) is an evolution of the GA where the focus is shifted from individual items, as in classical GAs, to groups or subsets of items.[52] The idea behind this GA evolution, proposed by Emanuel Falkenauer, is that solving some complex problems, a.k.a. clustering or partitioning problems, where a set of items must be split into disjoint groups of items in an optimal way, is better achieved by making characteristics of the groups of items equivalent to genes. These kinds of problems include bin packing, line balancing, clustering with respect to a distance measure, equal piles, etc., on which classic GAs proved to perform poorly. Making genes equivalent to groups implies chromosomes that are in general of variable length, and special genetic operators that manipulate whole groups of items. For bin packing in particular, a GGA hybridized with the Dominance Criterion of Martello and Toth is arguably the best technique to date.
  • Interactive evolutionary algorithms are evolutionary algorithms that use human evaluation. They are usually applied to domains where it is hard to design a computational fitness function, for example, evolving images, music, artistic designs and forms to fit users' aesthetic preference.
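As a concrete illustration of the self-adaptive mutation-and-selection loop that evolution strategies rely on, here is a minimal (1+1)-ES sketch in Python with NumPy. The sphere objective, the log-normal learning rate `tau`, and every other parameter value are illustrative assumptions rather than part of any canonical ES implementation.

```python
import numpy as np

def sphere(x):
    """Toy objective: sum of squares, minimum at the origin."""
    return float(np.sum(x ** 2))

def one_plus_one_es(f, dim=10, sigma=1.0, iterations=2000, seed=0):
    """(1+1)-Evolution Strategy sketch with a self-adapted step size.

    Each generation mutates the step size sigma (log-normal rule) and then
    the parent vector; the offspring replaces the parent only if it is at
    least as good (plus-selection).
    """
    rng = np.random.default_rng(seed)
    parent = rng.uniform(-5.0, 5.0, dim)
    best = f(parent)
    tau = 1.0 / np.sqrt(dim)  # illustrative learning rate for sigma self-adaptation
    for _ in range(iterations):
        child_sigma = sigma * np.exp(tau * rng.standard_normal())
        child = parent + child_sigma * rng.standard_normal(dim)
        fc = f(child)
        if fc <= best:  # keep the better of parent and offspring
            parent, best, sigma = child, fc, child_sigma
    return parent, best

if __name__ == "__main__":
    x, fx = one_plus_one_es(sphere)
    print("best value found:", fx)
```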


Swarm intelligence

Swarm intelligence is a sub-field of evolutionary computing.




  • Particle swarm optimization (PSO) is a computational method for multi-parameter optimization which also uses a population-based approach. A population (swarm) of candidate solutions (particles) moves in the search space, and the movement of the particles is influenced both by their own best known position and by the swarm's global best known position. Like genetic algorithms, the PSO method depends on information sharing among population members. In some problems PSO is often more computationally efficient than GAs, especially in unconstrained problems with continuous variables.[54] A minimal sketch appears below.
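The following is a minimal global-best PSO sketch in Python with NumPy, meant only to make the velocity-update rule above concrete. The inertia weight `w`, the acceleration coefficients `c1` and `c2`, the search bounds, and the toy objective are illustrative assumptions, not tuned recommendations.

```python
import numpy as np

def pso(f, dim=2, n_particles=30, iterations=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal global-best particle swarm optimizer (a sketch, not a library).

    Each particle is pulled toward its own best position (cognitive term)
    and toward the swarm's best position (social term).
    """
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([f(p) for p in pos])
    g = pbest[np.argmin(pbest_val)].copy()
    g_val = pbest_val.min()
    for _ in range(iterations):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = pos + vel
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val          # update personal bests
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if pbest_val.min() < g_val:          # update the swarm's global best
            g = pbest[np.argmin(pbest_val)].copy()
            g_val = pbest_val.min()
    return g, g_val

if __name__ == "__main__":
    print(pso(lambda x: float(np.sum(x ** 2))))
```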


Other evolutionary computing algorithms

Evolutionary computation is a sub-field of metaheuristic methods.




  • Electimize algorithm is an evolutionary algorithm that simulates the phenomenon of electron flow and electrical conductivity. Recent research has shown Electimize to be more efficient in solving NP-hard optimization problems than traditional evolutionary algorithms. The algorithm has a higher capacity to search the solution space extensively and to identify globally optimal alternatives. Unlike other evolutionary algorithms, Electimize evaluates the quality of the values in the solution string independently.[55]
  • Memetic algorithm (MA), often also called a hybrid genetic algorithm, is a population-based method in which solutions are also subject to local improvement phases. The idea of memetic algorithms comes from memes, which, unlike genes, can adapt themselves. In some problem areas they are shown to be more efficient than traditional evolutionary algorithms. A minimal sketch follows this list.
  • Bacteriologic algorithms (BA) are inspired by evolutionary ecology and, more particularly, bacteriologic adaptation. Evolutionary ecology is the study of living organisms in the context of their environment, with the aim of discovering how they adapt. Its basic concept is that in a heterogeneous environment, there is not one individual that fits the whole environment. So, one needs to reason at the population level. It is also believed BAs could be successfully applied to complex positioning problems (antennas for cell phones, urban planning, and so on) or data mining.[56]
  • Cultural algorithm (CA) consists of the population component almost identical to that of the genetic algorithm and, in addition, a knowledge component called the belief space.
  • Gaussian adaptation (normal or natural adaptation, abbreviated NA to avoid confusion with GA) is intended for the maximisation of manufacturing yield of signal processing systems. It may also be used for ordinary parametric optimisation. It relies on a certain theorem valid for all regions of acceptability and all Gaussian distributions. The efficiency of NA relies on information theory and a certain theorem of efficiency. Its efficiency is defined as information divided by the work needed to get the information.[58] Because NA maximises mean fitness rather than the fitness of the individual, the landscape is smoothed such that valleys between peaks may disappear. Therefore it has a certain "ambition" to avoid local peaks in the fitness landscape. NA is also good at climbing sharp crests by adaptation of the moment matrix, because NA may maximise the disorder (average information) of the Gaussian while simultaneously keeping the mean fitness constant.
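To make the "GA plus local improvement" idea behind memetic algorithms concrete, here is a minimal real-coded sketch in Python with NumPy in which each offspring is refined by a short hill-climbing phase before entering the next generation. The blend crossover, truncation selection, and all parameter values are illustrative assumptions, not a prescribed MA design.

```python
import numpy as np

def memetic(f, dim=5, pop_size=30, generations=100, seed=0):
    """Minimal memetic algorithm sketch: a real-coded GA whose offspring are
    refined by a short local hill-climbing phase before selection."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, (pop_size, dim))

    def local_search(x, steps=10, step=0.1):
        # Simple hill climbing: accept random perturbations that improve f.
        best, best_val = x.copy(), f(x)
        for _ in range(steps):
            cand = best + step * rng.standard_normal(dim)
            val = f(cand)
            if val < best_val:
                best, best_val = cand, val
        return best

    for _ in range(generations):
        fitness = np.array([f(ind) for ind in pop])
        order = np.argsort(fitness)
        parents = pop[order[: pop_size // 2]]        # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            alpha = rng.random(dim)
            child = alpha * a + (1 - alpha) * b       # blend crossover
            child += 0.05 * rng.standard_normal(dim)  # mutation
            children.append(local_search(child))      # memetic local refinement
        pop = np.array(children)
    fitness = np.array([f(ind) for ind in pop])
    return pop[np.argmin(fitness)], fitness.min()

if __name__ == "__main__":
    print(memetic(lambda x: float(np.sum(x ** 2))))
```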


Other metaheuristic methods

Metaheuristic methods broadly fall within stochastic optimisation methods.




  • Simulated annealing (SA) is a related global optimization technique that traverses the search space by testing random mutations on an individual solution. A mutation that increases fitness is always accepted. A mutation that lowers fitness is accepted probabilistically based on the difference in fitness and a decreasing temperature parameter. In SA parlance, one speaks of seeking the lowest energy instead of the maximum fitness. SA can also be used within a standard GA by starting with a relatively high rate of mutation and decreasing it over time along a given schedule. A minimal sketch of the acceptance rule appears after this list.
  • Tabu search (TS) is similar to simulated annealing in that both traverse the solution space by testing mutations of an individual solution. While simulated annealing generates only one mutated solution, tabu search generates many mutated solutions and moves to the solution with the lowest energy of those generated. In order to prevent cycling and encourage greater movement through the solution space, a tabu list is maintained of partial or complete solutions. It is forbidden to move to a solution that contains elements of the tabu list, which is updated as the solution traverses the solution space.
  • Extremal optimization (EO), unlike GAs, which work with a population of candidate solutions, evolves a single solution and makes local modifications to the worst components. This requires that a suitable representation be selected which permits individual solution components to be assigned a quality measure ("fitness"). The governing principle behind this algorithm is that of emergent improvement through selectively removing low-quality components and replacing them with a randomly selected component. This is decidedly at odds with a GA that selects good solutions in an attempt to make better solutions.
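As a concrete illustration of the probabilistic acceptance rule described for simulated annealing, here is a minimal sketch in plain Python. The geometric cooling schedule, the Gaussian neighbour move, and the toy one-dimensional objective are all illustrative assumptions.

```python
import math
import random

def simulated_annealing(f, x0, neighbor, t0=1.0, cooling=0.995, iterations=5000, seed=0):
    """Minimal simulated annealing sketch: always accept improving moves,
    accept worsening moves with probability exp(-delta / T), and cool T
    geometrically along the schedule."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, best_val = x, fx
    t = t0
    for _ in range(iterations):
        cand = neighbor(x, rng)
        fc = f(cand)
        delta = fc - fx
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc
            if fx < best_val:
                best, best_val = x, fx
        t *= cooling  # decreasing temperature parameter
    return best, best_val

if __name__ == "__main__":
    # Toy 1-D example: minimise (x - 3)^2 starting far from the optimum.
    result = simulated_annealing(
        f=lambda x: (x - 3.0) ** 2,
        x0=-10.0,
        neighbor=lambda x, rng: x + rng.gauss(0.0, 0.5),
    )
    print(result)
```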


Other stochastic optimisation methods

  • The cross-entropy (CE) method generates candidate solutions via a parameterized probability distribution. The parameters are updated via cross-entropy minimization, so as to generate better samples in the next iteration; a minimal sketch follows this list.
  • Reactive search optimization (RSO) advocates the integration of sub-symbolic machine learning techniques into search heuristics for solving complex optimization problems. The word reactive hints at a ready response to events during the search through an internal online feedback loop for the self-tuning of critical parameters. Methodologies of interest for Reactive Search include machine learning and statistics, in particular reinforcement learning, active or query learning, neural networks, and metaheuristics.
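The cross-entropy method's sample-select-refit loop can be sketched in a few lines. The following Python/NumPy example assumes a diagonal Gaussian sampling distribution, an elite fraction of 20%, and a toy continuous objective; all of these are illustrative choices rather than part of the method's definition.

```python
import numpy as np

def cross_entropy_method(f, dim=5, pop_size=100, elite_frac=0.2,
                         iterations=50, seed=0):
    """Minimal cross-entropy method sketch for continuous minimisation:
    sample from a Gaussian, keep the elite fraction, and refit the
    Gaussian's mean and standard deviation to those elites."""
    rng = np.random.default_rng(seed)
    mean = np.zeros(dim)
    std = np.ones(dim) * 5.0
    n_elite = max(1, int(elite_frac * pop_size))
    for _ in range(iterations):
        samples = rng.normal(mean, std, size=(pop_size, dim))
        scores = np.array([f(s) for s in samples])
        elites = samples[np.argsort(scores)[:n_elite]]
        mean = elites.mean(axis=0)
        std = elites.std(axis=0) + 1e-8   # small floor keeps sampling alive
    return mean, f(mean)

if __name__ == "__main__":
    print(cross_entropy_method(lambda x: float(np.sum(x ** 2))))
```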


See also


References

  1. Eiben, A. E. et al (1994). "Genetic algorithms with multi-parent recombination". PPSN III: Proceedings of the International Conference on Evolutionary Computation. The Third Conference on Parallel Problem Solving from Nature: 78–87. .
  2. Ting, Chuan-Kang (2005). "On the Mean Convergence Time of Multi-parent Genetic Algorithms Without Selection". Advances in Artificial Life: 403–412. .
  3. Deb, Kalyanmoy; Spears, William M. (1997). "C6.2: Speciation methods". Handbook of Evolutionary Computation. Institute of Physics Publishing. 
  4. Shir, Ofer M. (2012). "Niching in Evolutionary Algorithms". In Rozenberg, Grzegorz. Handbook of Natural Computing. Springer Berlin Heidelberg. pp. 1035–1069. doi:10.1007/978-3-540-92910-9_32. ISBN 9783540929093. 
  5. Harik, Georges R.; Lobo, Fernando G.; Sastry, Kumara (1 January 2006). Linkage Learning via Probabilistic Modeling in the Extended Compact Genetic Algorithm (ECGA). Studies in Computational Intelligence. 33. pp. 39–61. doi:10.1007/978-3-540-34954-9_3. ISBN 978-3-540-34953-2. 
  6. Pelikan, Martin; Goldberg, David E.; Cantú-Paz, Erick (1 January 1999). BOA: The Bayesian Optimization Algorithm. Gecco'99. pp. 525–532. ISBN 9781558606111. http://dl.acm.org/citation.cfm?id=2933973. 
  7. Coffin, David; Smith, Robert E. (1 January 2008). Linkage Learning in Estimation of Distribution Algorithms. Studies in Computational Intelligence. 157. pp. 141–156. doi:10.1007/978-3-540-85068-7_7. ISBN 978-3-540-85067-0. 
  8. Echegoyen, Carlos; Mendiburu, Alexander; Santana, Roberto; Lozano, Jose A. (8 November 2012). "On the Taxonomy of Optimization Problems Under Estimation of Distribution Algorithms". Evolutionary Computation. 21 (3): 471–495. doi:10.1162/EVCO_a_00095. ISSN 1063-6560. PMID 23136917. S2CID 26585053.
  9. Sadowski, Krzysztof L.; Bosman, Peter A.N.; Thierens, Dirk (1 January 2013). On the Usefulness of Linkage Processing for Solving MAX-SAT. Gecco '13. pp. 853–860. doi:10.1145/2463372.2463474. ISBN 9781450319638. 
  10. Taherdangkoo, Mohammad; Paziresh, Mahsa; Yazdi, Mehran; Bagheri, Mohammad Hadi (19 November 2012). "An efficient algorithm for function optimization: modified stem cells algorithm". Central European Journal of Engineering. 3 (1): 36–50. doi:10.2478/s13531-012-0047-8.
  11. Wolpert, D.H., Macready, W.G., 1995. No Free Lunch Theorems for Optimisation. Santa Fe Institute, SFI-TR-05-010, Santa Fe.
  12. Goldberg, David E. (1991). "The theory of virtual alphabets". Parallel Problem Solving from Nature. Lecture Notes in Computer Science. 496. pp. 13–22. doi:10.1007/BFb0029726. ISBN 978-3-540-54148-6. 
  13. Janikow, C. Z.; Michalewicz, Z. (1991). "An Experimental Comparison of Binary and Floating Point Representations in Genetic Algorithms" (PDF). Proceedings of the Fourth International Conference on Genetic Algorithms: 31–36. Retrieved 2 July 2013.
  14. Patrascu, M.; Stancu, A.F.; Pop, F. (2014). "HELGA: a heterogeneous encoding lifelike genetic algorithm for population evolution modeling and simulation". Soft Computing. 18 (12): 2565–2576. doi:10.1007/s00500-014-1401-y. S2CID 29821873.
  15. Baluja, Shumeet; Caruana, Rich (1995). Removing the genetics from the standard genetic algorithm (PDF). ICML.
  16. Srinivas, M.; Patnaik, L. (1994). "Adaptive probabilities of crossover and mutation in genetic algorithms" (PDF). IEEE Transactions on System, Man and Cybernetics. 24 (4): 656–667. doi:10.1109/21.286385.
  17. Zhang, J.; Chung, H.; Lo, W. L. (2007). "Clustering-Based Adaptive Crossover and Mutation Probabilities for Genetic Algorithms". IEEE Transactions on Evolutionary Computation. 11 (3): 326–335. doi:10.1109/TEVC.2006.880727. S2CID 2625150.
  18. See for instance Evolution-in-a-nutshell (archived 15 April 2016), or the example in the travelling salesman problem, in particular the use of an edge recombination operator.
  19. Goldberg, D. E.; Korb, B.; Deb, K. (1989). "Messy Genetic Algorithms : Motivation Analysis, and First Results". Complex Systems. 5 (3): 493–530.
  20. Gene expression: The missing link in evolutionary computation
  21. Harik, G. (1997). Learning linkage to efficiently solve problems of bounded difficulty using genetic algorithms (PhD). Dept. Computer Science, University of Michigan, Ann Arbour.
  22. Tomoiagă B, Chindriş M, Sumper A, Sudria-Andreu A, Villafafila-Robles R. Pareto Optimal Reconfiguration of Power Distribution Systems Using a Genetic Algorithm Based on NSGA-II. Energies. 2013; 6(3):1439-1455.
  23. Noetel M, Ciarrochi J, Sahdra B, Lonsdale C (2019). "Using genetic algorithms to abbreviate the Mindfulness Inventory for Sport: A substantive-methodological synthesis". Psychology of Sport and Exercise. 45 (101545): 101545. doi:10.1016/j.psychsport.2019.101545.
  24. Gross, Bill. "A solar energy system that tracks the sun". TED. Retrieved 20 November 2013.
  25. Hornby, G. S.; Linden, D. S.; Lohn, J. D., Automated Antenna Design with Evolutionary Algorithms (PDF)
  26. "Flexible Muscle-Based Locomotion for Bipedal Creatures".
  27. Evans, B.; Walton, S.P. (December 2017). "Aerodynamic optimisation of a hypersonic reentry vehicle based on solution of the Boltzmann–BGK equation and evolutionary optimisation". Applied Mathematical Modelling. 52: 215–240. doi:10.1016/j.apm.2017.07.024. ISSN 0307-904X.
  28. Skiena, Steven (2010). The Algorithm Design Manual (2nd ed.). Springer Science+Business Media. ISBN 978-1-849-96720-4. 
  29. Turing, Alan M. (October 1950). "Computing machinery and intelligence". Mind. LIX (238): 433–460. doi:10.1093/mind/LIX.236.433.
  30. Barricelli, Nils Aall (1954). "Esempi numerici di processi di evoluzione". Methodos: 45–68.
  31. Barricelli, Nils Aall (1957). "Symbiogenetic evolution processes realized by artificial methods". Methodos: 143–182.
  32. Fraser, Alex (1957). "Simulation of genetic systems by automatic digital computers. I. Introduction". Aust. J. Biol. Sci. 10 (4): 484–491. doi:10.1071/BI9570484.
  33. Fraser, Alex; Burnell, Donald (1970). Computer Models in Genetics. New York: McGraw-Hill. ISBN 978-0-07-021904-5. 
  34. Crosby, Jack L. (1973). Computer Simulation in Genetics. London: John Wiley & Sons. ISBN 978-0-471-18880-3. 
  35. 02.27.96 - UC Berkeley's Hans Bremermann, professor emeritus and pioneer in mathematical biology, has died at 69
  36. Fogel, David B. (editor) (1998). Evolutionary Computation: The Fossil Record. New York: IEEE Press. ISBN 978-0-7803-3481-6. 
  37. Barricelli, Nils Aall (1963). "Numerical testing of evolution theories. Part II. Preliminary tests of performance, symbiogenesis and terrestrial life". Acta Biotheoretica. 16 (3–4): 99–126. doi:10.1007/BF01556602. S2CID 86717105.
  38. Rechenberg, Ingo (1973). Evolutionsstrategie. Stuttgart: Holzmann-Froboog. ISBN 978-3-7728-0373-4. 
  39. Schwefel, Hans-Paul (1974). Numerische Optimierung von Computer-Modellen (PhD thesis). 
  40. Schwefel, Hans-Paul (1977). Numerische Optimierung von Computor-Modellen mittels der Evolutionsstrategie : mit einer vergleichenden Einführung in die Hill-Climbing- und Zufallsstrategie. Basel; Stuttgart: Birkhäuser. ISBN 978-3-7643-0876-6. 
  41. Schwefel, Hans-Paul (1981). Numerical optimization of computer models (Translation of 1977 Numerische Optimierung von Computor-Modellen mittels der Evolutionsstrategie. Chichester ; New York: Wiley. ISBN 978-0-471-09988-8. 
  42. Aldawoodi, Namir (2008). An Approach to Designing an Unmanned Helicopter Autopilot Using Genetic Algorithms and Simulated Annealing. pp. 99. ISBN 978-0549773498. https://books.google.com/books?id=-MszVdu_PAMC&q=general+electric+genetic+algorithm+mainframe. 
  43. Markoff, John (29 August 1990). "What's the Best Answer? It's Survival of the Fittest". New York Times. Retrieved 13 July 2016.
  44. Ruggiero, Murray A. (1 August 2009). Fifteen years and counting (archived 30 January 2016). Futuresmag.com. Retrieved on 2013-08-07.
  45. Evolver: Sophisticated Optimization for Spreadsheets. Palisade. Retrieved on 2013-08-07.
  46. Benchmarks for Evaluating Optimization Algorithms and Benchmarking MATLAB Derivative-Free Optimizers for Practitioners' Rapid Access, IEEE Access, vol. 7, 2019.
  47. Cohoon, J (2002). Evolutionary algorithms for the physical design of VLSI circuits. Springer, pp. 683-712, 2003. ISBN 978-3-540-43330-9. https://www.ifte.de/mitarbeiter/lienig/cohoon.pdf. 
  48. Pelikan, Martin; Goldberg, David E.; Cantú-Paz, Erick (1 January 1999). BOA: The Bayesian Optimization Algorithm. Gecco'99. pp. 525–532. ISBN 9781558606111. http://dl.acm.org/citation.cfm?id=2933973. 
  49. Pelikan, Martin (2005). Hierarchical Bayesian optimization algorithm : toward a new generation of evolutionary algorithms (1st ed.). Berlin [u.a.]: Springer. ISBN 978-3-540-23774-7. 
  50. Thierens, Dirk (11 September 2010). "The Linkage Tree Genetic Algorithm". pp. 264–273. doi:10.1007/978-3-642-15844-5_27. ISBN 978-3-642-15843-8. 
  51. Ferreira, C. "Gene Expression Programming: A New Adaptive Algorithm for Solving Problems" (PDF). Complex Systems, Vol. 13, issue 2: 87-129.
  52. Falkenauer, Emanuel (1997). Genetic Algorithms and Grouping Problems. Chichester, England: John Wiley & Sons Ltd. ISBN 978-0-471-97150-4. 
  53. Zlochin, Mark; Birattari, Mauro; Meuleau, Nicolas; Dorigo, Marco (1 October 2004). "Model-Based Search for Combinatorial Optimization: A Critical Survey". Annals of Operations Research (in English). 131 (1–4): 373–395. CiteSeerX 10.1.1.3.427. doi:10.1023/B:ANOR.0000039526.52305.af. ISSN 0254-5330. S2CID 63137.
  54. Rania Hassan, Babak Cohanim, Olivier de Weck, Gerhard Venter (2005). A comparison of particle swarm optimization and the genetic algorithm
  55. Khalafallah Ahmed; Abdel-Raheem Mohamed (1 May 2011). "Electimize: New Evolutionary Algorithm for Optimization with Application in Construction Engineering". Journal of Computing in Civil Engineering. 25 (3): 192–201. doi:10.1061/(ASCE)CP.1943-5487.0000080.
  56. Baudry, Benoit; Franck Fleurey; Jean-Marc Jézéquel; Yves Le Traon (March–April 2005). "Automatic Test Case Optimization: A Bacteriologic Algorithm" (PDF). IEEE Software. 22 (2): 76–82. doi:10.1109/MS.2005.30. S2CID 3559602. Retrieved 9 August 2009.
  57. Civicioglu, P. (2012). "Transforming Geocentric Cartesian Coordinates to Geodetic Coordinates by Using Differential Search Algorithm". Computers &Geosciences. 46: 229–247. Bibcode:2012CG.....46..229C. doi:10.1016/j.cageo.2011.12.011.
  58. Kjellström, G. (December 1991). "On the Efficiency of Gaussian Adaptation". Journal of Optimization Theory and Applications. 71 (3): 589–597. doi:10.1007/BF00941405. S2CID 116847975.


Bibliography

  • Bies, Robert R.; Muldoon, Matthew F.; Pollock, Bruce G.; Manuck, Steven; Smith, Gwenn; Sale, Mark E. (2006). "A Genetic Algorithm-Based, Hybrid Machine Learning Approach to Model Selection". Journal of Pharmacokinetics and Pharmacodynamics. 33 (2): 196–221. doi:10.1007/s10928-006-9004-6. PMID 16565924. S2CID 39571129.
  • Goldberg, David (1989). Genetic Algorithms in Search, Optimization and Machine Learning. Reading, MA: Addison-Wesley Professional. ISBN 978-0201157673. 
  • Goldberg, David (2002). The Design of Innovation: Lessons from and for Competent Genetic Algorithms. Norwell, MA: Kluwer Academic Publishers. ISBN 978-1402070983. 
  • Fogel, David (2006). Evolutionary Computation: Toward a New Philosophy of Machine Intelligence (3rd ed.). Piscataway, NJ: IEEE Press. ISBN 978-0471669517. 
  • Koza, John (1992). Genetic Programming: On the Programming of Computers by Means of Natural Selection. Cambridge, MA: MIT Press. ISBN 978-0262111706. 
  • Michalewicz, Zbigniew (1996). Genetic Algorithms + Data Structures = Evolution Programs. Springer-Verlag. ISBN 978-3540606765. 
  • Mitchell, Melanie (1996). An Introduction to Genetic Algorithms. Cambridge, MA: MIT Press. ISBN 9780585030944. 
  • Rechenberg, Ingo (1994): Evolutionsstrategie '94, Stuttgart: Fromman-Holzboog.
  • Schmitt, Lothar M. (2004). "Theory of Genetic Algorithms II: models for genetic operators over the string-tensor representation of populations and convergence to global optima for arbitrary fitness function under scaling". Theoretical Computer Science. 310 (1–3): 181–231. doi:10.1016/S0304-3975(03)00393-1.
  • Schwefel, Hans-Paul (1974): Numerische Optimierung von Computer-Modellen (PhD thesis). Reprinted by Birkhäuser (1977).
  • Hingston, Philip; Barone, Luigi; Michalewicz, Zbigniew (2008). Design by Evolution: Advances in Evolutionary Design. Springer. ISBN 978-3540741091. 
  • Eiben, Agoston; Smith, James (2003). Introduction to Evolutionary Computing. Springer. ISBN 978-3540401841. 


External links

Resources

  • [1] Provides a list of resources in the genetic algorithms field


Tutorials

Category:Evolutionary algorithms


Category:Search algorithms


Category:Cybernetics


Category:Digital organisms


Category:Machine learning



sv:Genetisk programmering#Genetisk algoritm



This page was moved from wikipedia:en:Genetic algorithm. Its edit history can be viewed at 遗传算法/edithistory