A Comprehensive Fuzzy-Rule-Based Self-Adaptive Genetic Algorithm




A COMPREHENSIVE FUZZY-RULE-BASED SELF-ADAPTIVE GENETIC ALGORITHM*

XIAO-BING HU, Department of Informatics, University of Sussex, Brighton, BN1 9QH, UK, [email protected]
EZEQUIEL DI PAOLO, Department of Informatics, University of Sussex, Brighton, BN1 9QH, UK, [email protected]
SHU-FAN WU, European Space Research and Technology Centre, European Space Agency, Keplerlaan 1, 2201 AG, Noordwijk ZH, The Netherlands, [email protected]

Abstract

Purpose - This paper presents a comprehensive self-adaptive Genetic Algorithm (GA) based on a fuzzy mechanism, aiming to improve both the optimizing capability and the convergence speed.

Design/Methodology/Approach - Many key factors that affect the performance of GAs are identified and analyzed, and their influences on the optimizing capability and the convergence speed are further elaborated; these influences prove very difficult to describe with explicit mathematical formulas. Therefore, a set of fuzzy rules is used to model these complicated relationships, in order to effectively guide the online self-adaptive adjustments, such as changing the crossover and mutation probabilities, and thus to improve the optimizing capability and convergence speed.

Findings - Simulation results illustrate that, compared with a normal GA and another self-adaptive GA based on explicit mathematical modeling of the key factors, the new GA is more advanced in terms of the optimizing capability and the convergence speed.

Originality/value - This paper develops a fuzzy-rule-based approach to describe the relationships between multiple GA parameters and online states, and the approach is useful in the design of a comprehensive self-adaptive GA.

Keywords: Genetic algorithm; fuzzy rule; self-adaptation; optimization.
Paper type: Research paper

1. Introduction

The Genetic Algorithm (GA) is a large-scale parallel stochastic searching and optimizing method inspired by the biological mechanisms of evolution and heredity. In recent years, GAs have been widely used for numerical optimization, combinatorial optimization, classifier systems and many other engineering problems (Holland, 1975; Goldberg, 1989; Michalewicz, 1994; Mitchell, 1996). Surprisingly, the idea of applying Darwinian principles to automated problem solving originates from the 1950s, long before the breakthrough of computers (Fogel, 1998). During the 1960s, three different implementations of this idea were developed at three different places.

*An earlier version of this paper was presented at CEC2007, 25-28 Sep 2007, Singapore.



In the USA, Holland introduced the genetic algorithm (Holland, 1975; Goldberg, 1989), while Fogel called his method evolutionary programming (Fogel et al., 1966; Fogel, 1995). In Germany, Rechenberg and Schwefel invented evolution strategies (Rechenberg, 1973; Schwefel, 1995). For about 15 years these areas developed separately; since the early 1990s they have been envisioned as different representatives (dialects) of one technology, called evolutionary computing. It was also in the early 1990s that a fourth stream following the same general ideas emerged: genetic programming (Koza, 1992; Banzhaf et al., 1998). The contemporary terminology denotes the whole field by evolutionary computing and considers evolutionary programming, evolution strategies, genetic algorithms, and genetic programming as sub-areas (Eiben and Schoenauer, 2002). This paper focuses on genetic algorithms, investigating how to use a fuzzy mechanism to take into account performance-related factors in order to deliver an efficient self-adaptive evolutionary computing algorithm. The underlying idea proposed in this paper can be easily applied to other sub-areas of evolutionary computing.

The basic idea of a GA is: given a population of chromosomes, environmental pressure causes natural selection (survival of the fittest), and thereby the fitness of the population grows. It is easy to see such a process as optimization. Given an objective function to be maximized, we can randomly create a set of candidate solutions (chromosomes) and use the objective function as an abstract fitness measure (the higher the better). Based on this fitness, some of the better chromosomes are chosen to seed the next generation by applying crossover and/or mutation. Crossover is applied to two selected chromosomes, the so-called parents, and results in one or two new chromosomes, the children. Mutation is applied to one chromosome and results in one new chromosome. Applying crossover and mutation leads to a set of new chromosomes, the offspring. Based on their fitness, these offspring compete with the old chromosomes for a place in the next generation. This process can be iterated until a solution is found or a previously set time limit is reached.

Many components of such an evolutionary process are stochastic. According to Darwin, the emergence of new species, adapted to their environment, is a consequence of the interaction between the survival-of-the-fittest mechanism and undirected variations. Variation operators must be stochastic: the choice of which pieces of information will be exchanged during crossover, as well as the changes in a chromosome during mutation, are random. On the other hand, selection operators can be either deterministic or stochastic. In the latter case, fitter chromosomes have a higher chance of being selected than less fit ones, but typically even weak chromosomes have a chance to become a parent or to survive. Due to the stochastic nature of GAs, some common problems arise concerning the optimizing capability and the convergence speed. GAs cannot always guarantee finding the global optimum, even for the simplest problems, let alone complex systems. Moreover, GAs are normally very time-consuming. In particular, when applied to a complex problem, GAs need to evolve many generations before the chromosomes can converge to some sub-optimal solutions, i.e., the convergence speed can be very slow. How to improve the optimizing capability and how to speed up the convergence of chromosomes are two important issues in GAs that many research works attempt to address. For example,
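The generational loop described above (selection, crossover, mutation, replacement) can be sketched as follows. The operator choices here (roulette-wheel selection, one-point crossover, single-bit mutation) and the helper name `evolve` are illustrative assumptions, not the paper's exact configuration:

```python
import random

# A minimal generational GA maximizing a fitness function over fixed-length
# bit-string chromosomes; a sketch, not the paper's exact setup.
def evolve(fitness, n_bits=22, pop_size=50, n_gen=200, pc=0.6, pm=0.1):
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(n_gen):
        # Fitness-proportionate ("roulette-wheel") selection of parents.
        scores = [fitness(c) for c in pop]
        lo = min(scores)
        weights = [s - lo + 1e-9 for s in scores]  # shift so weights stay positive
        parents = random.choices(pop, weights=weights, k=pop_size)
        # Crossover: one-point, applied with probability pc to each parent pair.
        children = []
        for a, b in zip(parents[::2], parents[1::2]):
            if random.random() < pc:
                cut = random.randint(1, n_bits - 1)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            children += [a[:], b[:]]
        # Mutation: each child has probability pm of one random bit flip.
        for c in children:
            if random.random() < pm:
                i = random.randrange(n_bits)
                c[i] ^= 1
        pop = children
        best = max(pop + [best], key=fitness)  # elitism: keep the best-so-far
    return best

best = evolve(lambda c: sum(c))  # trivial "count the ones" fitness
print(sum(best))  # usually close to the maximum of 22
```

With a fixed pc and pm, as here, the trade-off discussed below applies: the rates that diversify the population early on are the same ones that later risk destroying good chromosomes.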



neural networks, fuzzy theory, clustering techniques, problem-specific heuristic rules and self-adaptive strategies are often used to improve the performance of GAs (Xu and Liu, 1999; Chao et al., 1999; Hu et al., 2004; Eiben and Smith, 2003). Among these general methods for improving the optimizing capability and convergence speed of GAs, self-adaptation of algorithm parameters is probably the most widely used; it was actually introduced as early as evolution strategies were invented. In evolution strategies, the step sizes of mutation were encoded in chromosomes and evolved during a run of the algorithm (Rechenberg, 1973; Eiben and Smith, 2003). The same idea was also adopted in evolutionary programming (Fogel et al., 1966; Eiben and Smith, 2003). There are quite a few algorithm parameters that can be controlled by self-adaptation, such as the crossover and mutation probabilities (Chao et al., 1999; Zhang et al., 2007), the mutation step size (Rechenberg, 1973), the selection rate (Baker, 1987), the population size (Michalewicz, 1994; Arabas et al., 1994), and fitness function coefficients (Zhang and Mühlenbein, 1995). Self-adapting several algorithm parameters simultaneously has also been investigated, e.g., by Hinterding et al. (1996), where mutation rate and population size were both under self-adaptive control. The benefits of self-adaptation of algorithm parameters are even backed up by theoretical results in Beyer (2001). As is well known, many parameters affect the optimizing capability and convergence speed of GAs, and it is not surprising that each of these parameters should be, if possible, under self-adaptive control according to many different online states in the running of a GA, such as the current fitness level of the chromosomes and the evolutionary trajectory.

However, most existing methods just isolate and analyze the influence of a certain single online state on a specific algorithm parameter, and then make some modifications to that parameter accordingly. For example, in the self-adaptive GA reported in Chao et al. (1999), only the information on the current fitness level of the chromosomes is used to direct the online adjustment of the crossover and mutation probabilities. One reason for focusing on a single online state and a single parameter is to simplify the analysis and modification processes. With only the current fitness level of the chromosomes under consideration, Chao et al. (1999) can easily and precisely describe the online probability adjustment with explicit mathematical formulas.

This paper attempts to develop a comprehensive fuzzy-rule-based self-adaptive GA which takes into account many key performance-affecting factors. The motivations for using a fuzzy mechanism to combine the influences of many factors at the same time, rather than just a single factor, are: (1) The performance of GAs is related to many factors (parameters, and online states which affect the self-adaptation of parameters), and these factors work together when affecting the performance of GAs; (2) The interactions and relationships between these factors and the performance of GAs are very complicated; (3) Even if we consider only a single factor, its influence on the performance of GAs is still difficult to describe in an explicit and precise way. Although Chao et al. (1999) use explicit mathematical formulas to precisely describe the relationship between the fitness level of the current chromosomes and the crossover and mutation probabilities, those formulas are actually based on experience and experimental data; (4) A fuzzy mechanism is an easy and well-suited way to describe such complex interactions and



relationships, particularly when the interactions and relationships are observed from experience and experimental data. Actually, fuzzy rules have already been used in the design of self-adaptive GAs. For instance, Zhang et al. (2007) employ fuzzy rules to self-adapt the crossover and mutation probabilities, and Hu and Wu (2007) report a case study on the self-adaptation of multiple parameters based on a fuzzy mechanism. This paper extends the basic idea reported by Hu and Wu (2007) into a general methodology, and explains in more depth with a test case how to design the new GA and what the corresponding advantages are.

2. Key Factors Affecting the Performance of GAs

There are many factors which affect the performance of GAs, such as the structure of the chromosomes, the choice of fitness function, the definitions of the crossover and mutation operators, the population of chromosomes in a generation (NC), the total number of generations in the evolutionary process (NG), the crossover probability (PC) and the mutation probability (PM), and some other evolutionary-operation-related parameters. How to define and choose the structure of the chromosomes, the fitness function and the evolutionary operators is more related to the GA modelling of the concerned problem, and these are basically problem-dependent. The quality of the GA modelling obviously plays a crucial role in the performance of GAs. Once the GA modelling is completed, the structure of the chromosomes, the form of the fitness function, and the procedures of crossover and mutation are fixed. The population of chromosomes in a generation, NC, and the total number of generations in the evolutionary process, NG, normally depend on the size of the solution space, and they have obvious influences on the solution quality and the computational load of GAs. In general, choosing larger NC and NG leads to higher solution quality, but at the cost of a large amount of computational time consumed by the GA. Usually, NC and NG are determined and set as constants before the GA is implemented to solve the concerned problem. Therefore, these factors are seldom used to design a self-adaptive GA. Only some evolutionary-operation-related parameters, such as the crossover probability PC and the mutation probability PM, can easily be adjusted online according to the current situation of optimization in a self-adaptive GA.

2.1. Crossover and mutation probabilities

As is well known, crossover and mutation play crucial roles in diversifying chromosomes, and the probabilities PC and PM determine how often crossover and mutation happen to chromosomes during the evolutionary process. On one hand, larger PC and PM enable GAs to search the solution space more effectively, and therefore increase the potential of finding better solutions and even hitting the global optimums. On the other hand, larger PC and PM also increase the risk of damaging and destroying good chromosomes, which can slow down and even prevent the solutions of GAs from converging. Many research works show that, during the optimizing process of GAs, adjusting PC and PM online according to the current situation of optimization can



significantly improve the solution quality and increase the convergence speed (Chao et al., 1999; Hu et al., 2004). The question is: according to what, and how, to adjust. Chao et al. (1999) choose the current fitness level of the chromosomes to direct the online adjustments of PC and PM, and the influence of the current fitness level on PC and PM is described by explicit and precise mathematical formulas, such that the proposed GA has self-adaptive features. The current fitness level of the chromosomes can be represented by the average fitness of the current generation, favg, and the maximum and minimum fitness of the current generation, fmax and fmin, which affect PC and PM in two ways. Firstly, the value of (fmax-favg)/(fmax-fmin) gives information about how the chromosomes of the current generation are distributed in the solution space. If the value is larger, the chromosomes are more evenly distributed, and accordingly a larger PC can help them to converge. In the opposite case, the chromosomes may have converged to some local optimums, and therefore a larger PM is needed to diversify the chromosomes. Secondly, suppose f is the fitness of a certain chromosome; then the value of (fmax-f)/(fmax-fmin) indicates the quality of this chromosome compared with the rest of the generation. If the value is smaller, this chromosome is excellent, and consequently smaller PC and PM should be applied to it in order to protect it. Otherwise, larger PC and PM can be chosen to change it dramatically. However, besides the current fitness level of the chromosomes, there are many other factors which can be used to direct the online adjustment of PC and PM. For example, how many generations have been evolved (indicated by the serial number of the current generation, NCG), and for how many generations the maximum fitness has been kept unchanged (denoted NGFUC).

The value of NCG/NG roughly indicates which stage the optimization of the GA has reached, i.e., an early stage or a late stage. If NCG/NG is close to zero, the optimization has just begun, and therefore relatively larger PC and PM should be applied in order to search the solution space more widely; if NCG/NG is near 1, the optimization is about to complete, so smaller PC and PM are preferable in order to effectively protect and only slightly modify good chromosomes. The value of NGFUC can be used to assess the optimization situation of the latest few generations. A smaller NGFUC means that progress has just been made and some new good chromosomes have just been found, so smaller PC and PM should be applied in order to gently modify those good chromosomes and check whether an optimum solution is near them. If the maximum fitness has not been updated for many generations, i.e., NGFUC is large, this implies the optimization has made little progress by searching around the current chromosomes, and therefore PC and PM need to be increased to diversify the chromosomes.
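The online states discussed above can be computed directly from the current generation. The following sketch uses hypothetical variable names mirroring the text; the adjustment directions remain qualitative, as in the paper:

```python
# Hypothetical sketch: computing the online states used to steer PC and PM.
def online_states(fitnesses, n_cg, n_g, n_gfuc):
    f_max, f_min = max(fitnesses), min(fitnesses)
    f_avg = sum(fitnesses) / len(fitnesses)
    spread = (f_max - f_avg) / (f_max - f_min + 1e-12)  # population spread
    stage = n_cg / n_g                                  # early vs. late stage
    return spread, stage, n_gfuc

spread, stage, stall = online_states([0.2, 0.5, 0.9], n_cg=20, n_g=200, n_gfuc=8)
# Qualitative rules from the text: a small spread or a large stall count
# suggests raising PM to diversify; a late stage suggests lowering PC and PM.
```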



used. The methods used in Wu et al. (1999) and Xu and Liu (1999) are more general, combining external technologies such as neural networks. Here, an internal factor is identified which plays an important role in the searching power of GAs: the range for evolutionary operations, in other words, which parts of a chromosome crossover and mutation are allowed to apply to. For the sake of simplicity, suppose there is only one real-valued variable, and a chromosome is the binary representation of the variable. Clearly, if changes happen to the first few genes in a chromosome, the resulting new value will be far away from the original value, while if they happen to the last few genes, the new value and the old one will be close to each other. Normally, evolutionary operations are allowed to apply to any part of any chromosome (except a few specified best chromosomes under full protection) at any time. However, here an extra system parameter is introduced to limit the range for evolutionary operations: LRange, which means only the last LRange genes in a chromosome are exposed to the evolutionary operations. LRange is adjusted online according to the current optimization situation of the GA and the quality of each individual chromosome, and it varies between 0 and LC, the length of the chromosomes. Similar to the analysis of PC and PM, the following is some experience-based knowledge on how LRange should be adjusted online. The current fitness level of the chromosomes affects LRange in two ways. Firstly, if the value of (fmax-favg)/(fmax-fmin) is small, which means the chromosomes are close to some optimums, then a small LRange will help to find those optimums. Secondly, if the value of (fmax-f)/(fmax-fmin) is small, which implies that this chromosome is likely near an optimum, then a small LRange is preferable; otherwise, a large LRange should be applied to this chromosome. If NCG/NG is near 0, set LRange=LC in order to diversify the chromosomes. If NCG/NG is close to 1, which indicates the optimization is in its final stage, then a small LRange is preferable in order to gently modify the current chromosomes and better hit the optimums. When NGFUC is large, which means the searching strategy under the current LRange has not made progress, LRange should be changed accordingly in order to diversify the chromosomes or gently modify them. When LRange is medium, if NGFUC is small, then a large LRange is preferable, while if NGFUC is large, then a small LRange will be better.

The above analyses mainly focus on the influences of many factors on PC, PM and LRange. It is PC, PM and LRange which finally determine the optimizing capability and convergence speed of GAs. Basically, small PC, PM and LRange help to protect and improve the current chromosomes, while large ones help to diversify the chromosomes. The influence of PC, PM and LRange on the searching power of GAs is illustrated by Figure 1 in a more intuitive way. Table I summarizes the above analyses. From these analyses, clearly, the influences and relationships between those factors and the searching power of GAs are very complex, and are therefore very difficult, if not impossible, to describe using precise mathematical formulas. On the other hand, the fuzzy mechanism offers great potential for describing the above experience-based knowledge, which is the major topic of this paper, as elaborated hereafter.
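The LRange idea restricts variation to the low-order (fine-grained) part of the encoding. A minimal sketch of a range-limited mutation operator, with illustrative names:

```python
import random

# Sketch of the LRange idea: mutation touches only the last LRange genes,
# so changes stay in the low-order part of the binary encoding.
def range_limited_mutate(chrom, l_range, pm):
    """Flip each of the last l_range bits independently with probability pm."""
    lc = len(chrom)
    out = chrom[:]
    for i in range(lc - l_range, lc):
        if random.random() < pm:
            out[i] ^= 1
    return out

c = [1, 0, 1, 1, 0, 0, 1, 0]
m = range_limited_mutate(c, l_range=3, pm=1.0)
print(c[:5] == m[:5])  # True: the first LC - LRange genes are never touched
```

With l_range = lc this reduces to ordinary bitwise mutation; with a small l_range it performs the "gentle modification" of near-optimal chromosomes described above.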



Fig. 1. Influence of PC, PM, and LRange on the searching power of GAs.

Table I. Influences of key factors on PC, PM, and LRange.

State                      Value   PC   PM   LRange
(fmax-favg)/(fmax-fmin)    L       L    S    L
                           S       S    L    S
(fmax-f)/(fmax-fmin)       L       L    L    L
                           S       S    S    S
NCG/NG                     L       S    S    S
                           S       L    L    L
NGFUC                      L       L    L    S
                           S       S    S    L

3. Methodology of the Comprehensive Fuzzy-Rule-Based Self-Adaptive GA

This section describes a general methodology for designing a comprehensive fuzzy-rule-based self-adaptive GA. Figure 2 gives the flowchart of this methodology. Obviously, the first step is to identify those parameters which are suitable for self-adaptation, as well as those online states whose changes will trigger the self-adaptation of the parameters. This work has been done in Section 2, where PC, PM and LRange are the self-adaptive parameters, and (fmax-favg)/(fmax-fmin), (fmax-f)/(fmax-fmin), NCG/NG and NGFUC are the key online states which affect the parameters. However, for the sake of generality, it is assumed there are nP self-adaptive parameters and nS online states.

Secondly, we need to decide how many fuzzy values should be assigned to each self-adaptive parameter as well as to each online state. Suppose each parameter has nFVP fuzzy values, and each state has nFVS fuzzy values. Then the total number of fuzzy values for the parameters and the total number of fuzzy values for the states are, respectively,

nTFVP = nP × nFVP,  nTFVS = nS × nFVS.  (1)

Originally, each online state has a numerical value. To map such a numerical value of a state to an appropriate fuzzy value, we need to define a membership function for that state. It should be noted that the same numerical value of online state j may correspond to the f(i1)th fuzzy value for parameter i1, and to the f(i2)th fuzzy value for parameter i2. For instance, when PC or PM is considered, NGFUC=8 may be given the fuzzy value 'medium', while if LRange needs to be calculated, NGFUC=8 may be given the fuzzy value 'small'. In other words, each online state may have nP membership functions, which are associated with the nP self-adaptive parameters. Therefore, in total there should be nS×nP membership functions.
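A common way to realize such a membership function is a triple of triangular memberships over the state's normalized range. The breakpoints below are illustrative only; in the paper's scheme each (state, parameter) pair would get its own tuned triple:

```python
# Hypothetical triangular membership functions over [0, 1] for the fuzzy
# values S/M/L; breakpoints are example choices, not the paper's tuning.
def tri(x, a, b, c):
    """Triangular membership with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def membership_SML(x):
    return {
        "S": tri(x, -0.001, 0.0, 0.5),   # peak at 0
        "M": tri(x, 0.0, 0.5, 1.0),      # peak at 0.5
        "L": tri(x, 0.5, 1.0, 1.001),    # peak at 1
    }

print(membership_SML(0.25))  # S and M both partially active, L inactive
```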

Fig. 2. Methodology to design and implement new self-adaptive GA.

Since only numerical values are used by the parameters in the implementation of GAs, we need a way to convert a fuzzy value of a parameter back into a numerical value. Usually, this can be done by defining a fuzzy-value-to-numerical-value table (FV2NVT). Then a fuzzy-rule table (FRT), i.e., a table of fuzzy relationships between the self-adaptive parameters and the online states, needs to be established, just like Table I. Each of the nTFVP×nTFVS cells at the bottom-right of the FRT determines a fuzzy rule. For instance, the fuzzy rule associated with the last cell in Table I is: if NGFUC is 'small', then LRange needs to be set 'large'.

From the FRT, one can see that each self-adaptive parameter is affected by many different online states. In other words, for each parameter, we may get a totally different fuzzy value according to different states. How, then, to integrate these different fuzzy values into a final numerical value for the parameter? Basically, there are two methods: the gravity-centre rule (GCR) and the maximum-value rule (MVR). Suppose we need to calculate vNP(i), the final numerical value of self-adaptive parameter i, i=1,…,nP. For a given numerical value vNS(j) of online state j, j=1,…,nS, let pS(f,j,i) be the probability, i.e., the likelihood, that state j has the fth fuzzy value, f=1,…,nFVS, according to its membership function associated with parameter i. With pS(f,j,i) and an FRT such as Table I, one can easily get pP(f,i,j), the probability that parameter i has the fth fuzzy value according to state j. For example, corresponding to the fuzzy rule "if state j has the fSth fuzzy value, then parameter i has the fPth fuzzy value", one has

pP(fP, i, j) = pS(fS, j, i).  (2)

Therefore, the probability that parameter i has the fth fuzzy value is

pP(f, i) = (1/nS) × Σ_{j=1..nS} pP(f, i, j).  (3)
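Equations (2)-(3) amount to a table lookup followed by averaging over the states. A toy sketch with a two-state FRT (not the paper's full Table I; names are illustrative):

```python
# Sketch of Equations (2)-(3): each state's fuzzy-value likelihoods are routed
# through the fuzzy-rule table (FRT) to likelihoods over the parameter's fuzzy
# values, then averaged over all states.
FRT = {  # FRT[state][state fuzzy value] -> parameter fuzzy value
    "spread": {"S": "L", "L": "S"},   # e.g. small spread -> large PM
    "stall":  {"S": "S", "L": "L"},
}

def parameter_likelihoods(state_probs):
    """state_probs[state][fuzzy value] -> averaged pP(f, i), as in Eq. (3)."""
    acc = {"S": 0.0, "M": 0.0, "L": 0.0}
    for state, probs in state_probs.items():
        for fs, p in probs.items():        # Eq. (2): route p through the rule
            fp = FRT[state].get(fs)
            if fp is not None:             # no rule defined -> contributes 0
                acc[fp] += p
    n_s = len(state_probs)
    return {f: v / n_s for f, v in acc.items()}

p = parameter_likelihoods({"spread": {"S": 0.8, "L": 0.2},
                           "stall":  {"S": 0.3, "L": 0.7}})
```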

Suppose the FV2NVT is given, and vNP(i,f) is the numerical value corresponding to the fth fuzzy value of parameter i. Then, under the GCR,

vNP(i) = min(1, [Σ_{f=1..nFVP} vNP(i,f) × pP(f,i)] / [Σ_{f=1..nFVP} vNP(i,f)]),  (4)

while under the MVR,

vNP(i) = vNP(i, fM)  (5)

if pP(fM,i) ≥ pP(f,i) for all f=1,…,nFVP.

4. An Example of Design

Now the methodology proposed in Section 3 is used to design a comprehensive fuzzy-rule-based self-adaptive GA for the following optimization problem:

max_{x∈[-256,256]} y(x) = max_{x∈[-256,256]} sin(30x) × (1 - |x|/2).  (6)

There are many local optimums in the above problem, which can easily trap normal optimizing algorithms. Only two global optimums exist, i.e., x=±0.05179, and the associated maximum y(x) is 0.9739626. Basically, the self-adaptive parameters and online states identified in Section 2 are used in this design, but with two minor modifications. First, the crossover probability for each individual chromosome is determined by two parameters:

PC = GC × KC,  (7)



where GC is the crossover probability for the current generation, and KC is an adjustment coefficient depending on each individual chromosome. Similarly, we have

PM = GM × KM  (8)

for mutation. The other modification is that we use a real number 0 ≤ lRange ≤ 1, rather than the integer LRange, to indicate the range for evolutionary operations, and

LRange = round(lRange × LC),  (9)

where "round" is a function rounding a real number to the nearest integer. For the sake of simplicity, the fuzzy value set [S, M, L] is adopted for all parameters and states. Based on the new fuzzy value set and the modified parameters and states, we extend Table I into Table II, where '--' means no rule is defined, in other words, the state has no influence on that parameter.

Table II. FRT especially for problem (6).

State                      Value   GC   GM   KC   KM   lRange
(fmax-favg)/(fmax-fmin)    L       L    S    --   --   L
                           M       M    M    --   --   M
                           S       S    L    --   --   S
(fmax-f)/(fmax-fmin)       L       --   --   L    L    L
                           M       --   --   M    M    M
                           S       --   --   S    S    S
NCG/NG                     L       S    S    --   --   S
                           M       M    M    --   --   M
                           S       L    L    --   --   L
NGFUC                      L       L    L    --   --   S
                           M       M    M    --   --   M
                           S       S    S    --   --   L

Since state (fmax-favg)/(fmax-fmin) can affect 3 parameters, i.e., GC, GM and lRange, we need to design 3 membership functions for it, as shown in Figure 3. Similarly, we design 3 membership functions for each of (fmax-f)/(fmax-fmin), NCG/NG, and NGFUC, as given in Figures 4 to 6, respectively. From Figures 3 to 6, one can see that some states have the same membership function for all the parameters they affect, while others use different ones. For example, state (fmax-favg)/(fmax-fmin) uses different membership functions for different parameters, while state (fmax-f)/(fmax-fmin) uses the same one. One may also notice something unusual in these membership functions. For example, in Figure 3(c), when the likelihood of 'L' is 1, the likelihoods of 'M' and 'S' are not 0. At first glance this goes against what classical fuzzy theory requires, but the "min" function in Equation (4) avoids any potential problems when using these unusual membership functions. The point here is that the design of membership functions relies on the concerned problem and on experience. For instance, the membership functions given in Figures 3 to 6 are especially designed for the optimization problem (6) to achieve the best performance.



Fig. 3. Membership functions of (fmax-favg)/(fmax-fmin).

Fig. 4. Membership functions of (fmax-f)/(fmax-fmin).

Fig. 5. Membership functions of NCG/NG.

Fig. 6. Membership functions of NGFUC.

The FV2NVT in this example is defined in Table III; the crossover and mutation probabilities are then calculated according to the GCR, while lRange is calculated according to the MVR. Since some online states may have no influence on certain self-adaptive parameters, in this case Equation (2) needs to be modified as

pP(fP, i, j) = pS(fS, j, i), if a rule is defined; pP(fP, i, j) = 0, if no rule is defined.  (10)
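Putting Equations (2)-(5) and (10) together, the whole inference step can be sketched as follows. This follows one reading of Eq. (4), with the weighted sum normalized by the sum of the numerical values; the helper names and example likelihoods are illustrative, and the FV2NVT values are the GC column of Table III:

```python
# Sketch of the combination rules: gravity-centre rule (GCR, Eq. 4) and
# maximum-value rule (MVR, Eq. 5), applied to a parameter's fuzzy likelihoods.
FV2NVT_GC = {"L": 1.0, "M": 0.7, "S": 0.4}  # GC column of Table III

def gcr(p, fv2nv):
    # Eq. (4): capped, value-weighted average of the fuzzy likelihoods.
    num = sum(fv2nv[f] * p[f] for f in p)
    den = sum(fv2nv[f] for f in p)
    return min(1.0, num / den)

def mvr(p, fv2nv):
    # Eq. (5): take the numerical value of the most likely fuzzy value.
    f_m = max(p, key=p.get)
    return fv2nv[f_m]

p = {"L": 0.6, "M": 0.3, "S": 0.1}  # example pP(f, i) from Eqs. (2)-(3), (10)
print(gcr(p, FV2NVT_GC), mvr(p, FV2NVT_GC))
```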

Table III. Fuzzy output variables in the new GA.

Parameter   L     M     S
GC          1     0.7   0.4
GM          0.8   0.4   0.1
KC          1     0.5   0.1
KM          1     0.5   0.1
lRange      1     0.8   0.4



By using the fuzzy mechanism to combine the influences of many relevant online states to self-adjust GC, GM, KC, KM, and lRange, the optimizing capability and convergence speed can be significantly improved, as will be demonstrated in the following simulation section. Figure 7 gives the flowchart of this new self-adaptive GA.

Fig. 7. Flowchart of new self-adaptive GA.



5. Simulation Results

To better assess the optimizing capability and convergence speed of the self-adaptive GA proposed in this paper, two other GAs are designed for comparative purposes. One is a normal GA with fixed PC=0.6 and PM=0.1, denoted GA1; the other is the self-adaptive GA reported by Chao et al. (1999), denoted GA2. For distinguishing purposes, the new GA of this paper is denoted GA3. For all three GAs, the variable x is represented by 22-bit chromosomes, where the first bit determines the sign of x, the following 8 bits represent the integral part of x, and the remaining 13 bits encode the fractional part of x. The population of a generation of chromosomes is NC=50, and the total number of generations in the evolutionary process is NG=200. Each GA runs 100 times to solve the optimization problem (6), and the results are given in Table IV, Table V and Figure 8.

Table IV. Performance of the three GAs.

                                                          GA1    GA2    GA3
Generations needed to hit the global optimums (average)   ≥146   ≥97    =71
Runs ending at local optimums (out of 100)                66     15     0

Table V. Analysis of changes of fmax (suppose fmax changes 100 times in total).

                                            GA1   GA2   GA3
fmax changes within 5 generations           43    52    68
fmax changes in 5~15 generations            35    30    28
fmax changes in >15 generations             22    18    4
Generations per change of fmax (average)    10    7     4
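The 22-bit encoding described above (sign bit, 8 integer bits, 13 fraction bits) can be decoded as follows; the most-significant-bit-first ordering is an assumption, as the paper does not state the bit order:

```python
# Sketch of decoding the 22-bit chromosome used by all three GAs:
# 1 sign bit, 8 bits for the integer part, 13 bits for the fractional part.
def decode(chrom):
    """Map a 22-bit list (sign, 8 integer bits, 13 fraction bits) to x."""
    assert len(chrom) == 22
    sign = -1 if chrom[0] else 1
    integer = int("".join(map(str, chrom[1:9])), 2)       # MSB-first assumed
    fraction = int("".join(map(str, chrom[9:])), 2) / 2**13
    return sign * (integer + fraction)

x = decode([0] + [0] * 8 + [0, 0, 0, 0, 0, 0, 0, 1, 1, 0, 1, 0, 1])
print(x)  # a small positive value, 53/8192
```

The 13 fraction bits give a resolution of 2^-13 ≈ 0.000122, fine enough to represent the global optimums x = ±0.05179 to the precision reported in Table IV.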

Fig. 8. Solution quality in optimization process of GAs.

From Table IV, one can see that GA1 has the worst performance, which is understandable. GA1 applies fixed PC and PM to all chromosomes regardless of their individual qualities, and evolutionary operations can always be carried out in any part of a chromosome, i.e., LRange=22. Therefore, good chromosomes in GA1 are very likely to be destroyed and have little chance to be improved. As a result, GA1 often ends up at local optimums when the 200 generations run out (this is why the "≥" sign appears in Table IV). GA2 achieves relatively better performance due to its self-adaptive features. However, considering only the influence of the fitness level on PC and PM is not enough; sometimes GA2 still ends up at local optimums. Clearly, by using the fuzzy mechanism to combine the influences of many relevant factors on PC, PM and LRange, GA3 has the best optimizing capability and the fastest convergence speed. On average, GA3 hits the global optimums after 71 evolutionary generations, and it never stops at a local optimum in all 100 runs.

Table V gives more details about the optimizing capability and convergence speed of the three GAs. To make them easy to compare and understand, suppose the maximum fitness fmax has changed 100 times in total during the evolutionary process of each GA. Under GA3, 68 changes of fmax happen within 5 successive generations, 28 changes happen in 5~15 successive generations, only 4 changes require more than 15 successive generations, and, on average, each change of fmax requires just 4 generations of evolution. Clearly, GA3 proves to be the most efficient algorithm. Figure 8 gives an example of how solution quality (fitness) changes during the optimization process of the three GAs. From Figure 8, one can see more clearly that, compared with GA1 and GA2, the optimizing capability and convergence speed of GA3 are greatly improved.

6. Conclusions

The optimizing capability and the convergence speed of Genetic Algorithms (GAs) are affected by many factors, and the relationships between them are complex and difficult to describe precisely.
Unlike other papers which, by using explicit mathematical formulas, only study the influence of a single factor on the performance of GAs, this paper reports a comprehensive fuzzy-rule-based method that takes many factors into account when designing the GA. First, the key factors affecting the optimizing capability and the convergence speed of GAs are identified and analyzed; their influences are then combined under a fuzzy mechanism to effectively direct the online self-adaptive adjustments, such as modifying the crossover and mutation probabilities. The simulation results show that convergence is greatly sped up and the optimizing capability is also improved. Future work may analyze more GA parameters and online states, develop a toolbox for automatically generating and adjusting fuzzy rules and membership functions, test on a wider range of benchmark problems, and then pursue some real-world applications.

7. Acknowledgements

This work was partially supported by the Engineering and Physical Sciences Research Council (EPSRC) Grant EP/C51632X/1.


References

Arabas, J., Michalewicz, Z. and Mulawka, J. (1994). GAVaPS - a genetic algorithm with varying population size. Proceedings of the 1st IEEE Conference on Evolutionary Computation, IEEE Press, Piscataway, NJ.
Baker, J. E. (1987). Reducing bias and inefficiency in the selection algorithm. Proceedings of the 2nd International Conference on Genetic Algorithms and Their Applications, Lawrence Erlbaum, Hillsdale, NJ.
Banzhaf, W., Nordin, P., Keller, R. E. and Francone, F. D. (1998). Genetic Programming - An Introduction, Morgan Kaufmann, San Francisco, CA.
Beyer, H. G. (2001). The Theory of Evolution Strategies, Springer, Berlin, Heidelberg, New York.
Chao, X. L., Zheng, Z., Fan, N. and Wang, X. F. (1999). A modified genetic algorithm by integrating neural network technology. Pattern Recognition and Artificial Intelligence, 12: 486-492.
Eiben, A. E. and Schoenauer, M. (2002). Evolutionary computing. Information Processing Letters, 82: 1-6.
Eiben, A. E. and Smith, J. E. (2003). Introduction to Evolutionary Computing, Springer-Verlag, Berlin, Germany.
Fogel, D. B. (1995). Evolutionary Computation: Toward a New Philosophy of Machine Intelligence, IEEE Press, Piscataway, NJ.
Fogel, D. B. (1998). Evolutionary Computation: The Fossil Record, IEEE Press, Piscataway, NJ.
Fogel, L. J., Owens, A. J. and Walsh, M. J. (1966). Artificial Intelligence through Simulated Evolution, John Wiley, New York.
Goldberg, D. E. (1989). Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, Reading, MA.
Hinterding, R., Michalewicz, Z. and Peachey, T. C. (1996). Self-adaptive genetic algorithm for numeric functions. Proceedings of the 4th Conference on Parallel Problem Solving from Nature, No. 1141 in Lecture Notes in Computer Science, pp. 420-429, Springer, Berlin, Heidelberg, New York.
Holland, J. H. (1975). Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, MI.
Hu, X. B. and Wu, S. F. (2007). Self-adaptive genetic algorithm based on fuzzy mechanism. The 2007 IEEE Congress on Evolutionary Computation (CEC2007), 25-28 Sep 2007, Singapore.
Hu, X. B., Wu, S. F. and Jiang, J. (2004). On-line free-flight path optimization based on improved genetic algorithms. Engineering Applications of Artificial Intelligence, 17: 897-907.
Koza, J. R. (1992). Genetic Programming: On the Programming of Computers by Means of Natural Selection, MIT Press, Cambridge, MA.
Michalewicz, Z. (1994). Genetic Algorithms + Data Structures = Evolution Programs, Springer-Verlag, Berlin, Germany.
Mitchell, M. (1996). An Introduction to Genetic Algorithms, MIT Press, Cambridge, MA.
Rechenberg, I. (1973). Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution, Frommann-Holzboog Verlag, Stuttgart.
Schwefel, H. P. (1995). Numerical Optimization of Computer Models, 2nd ed., John Wiley & Sons, New York (1st ed. 1981).
Wu, Z. Y., Shao, H. H. and Wu, X. Y. (1999). A new self-adaptive genetic algorithm and its application. Control Theory and Applications, 16: 127-129.
Xu, J. W. and Liu, J. W. (1999). Genetic algorithms based on sub bio-environment technology. Pattern Recognition and Artificial Intelligence, 12: 104-107.
Zhang, B. and Mühlenbein, H. (1995). Balancing accuracy and parsimony in genetic programming. Evolutionary Computation, 3: 17-38.
Zhang, J., Chung, H. S. H. and Lo, W. L. (2007). Clustering-based adaptive crossover and mutation probabilities for genetic algorithms. IEEE Transactions on Evolutionary Computation, 11: 326-335.

Xiao-Bing Hu received the B.S. degree in aviation electronic engineering from the Civil Aviation Institute of China, Tianjin, China, in 1998, the M.S. degree in automatic control engineering from Nanjing University of Aeronautics and Astronautics, Nanjing, China, in 2001, and the Ph.D. degree in aeronautical and automotive engineering from Loughborough University, Leicestershire, U.K., in 2005. He is currently a Research Fellow with the Department of Informatics, University of Sussex, Brighton, U.K. His major fields of research include predictive control, artificial intelligence, air traffic management, and flight control.

Ezequiel Di Paolo received the M.S. degree in nuclear engineering from the Institute Balseiro, Bariloche, Argentina, and the Ph.D. degree from the University of Sussex, Brighton, U.K. He is currently a Senior Lecturer with the Evolutionary and Adaptive Systems Group, University of Sussex. He is also with the Centre for Research in Cognitive Science, University of Sussex. His research interests include adaptive behavior in natural and artificial systems, biological modeling, evolutionary robotics, and enactive cognitive science.

Shu-Fan Wu received his M.Sc. and Ph.D. degrees from the Department of Automatic Control at Nanjing University of Aeronautics & Astronautics, China. His research interests cover many aspects of systems and control engineering, particularly flight control of both aircraft and spacecraft. He currently holds a senior position in the Directorate of Technology and Quality Management, European Space Research and Technology Centre (ESTEC/TEC-ECN), the Netherlands.
