An Experimental Comparison of Multiobjective Algorithms: NSGA-II and OMOPSO


Adriana Cortés Godínez
Instituto Tecnológico de León
[email protected]

Luis Ernesto Mancilla Espinosa
Instituto Tecnológico de León
[email protected]

Efrén Mezura-Montes
Laboratorio Nacional de Informática Avanzada (LANIA)
[email protected]

Abstract. The optimization of multiobjective problems is currently an important area of research and development. The attention attracted by this type of problem has driven the development of multiple metaheuristics for solving it. To determine which multiobjective metaheuristic performs best on a given problem, this article presents an experimental comparison between two of them, the Non-dominated Sorting Genetic Algorithm II (NSGA-II) and the multi-objective particle swarm optimizer OMOPSO, using the ZDT test functions. The results obtained by both algorithms are compared and analyzed with different performance metrics that evaluate both the dispersion of the solutions and their proximity to the Pareto front.

1. Introduction
Optimization seeks to minimize or maximize the value of a function over a given search space. In areas such as engineering or economics, one often faces problems that require the simultaneous optimization of two or more functions; in these cases we speak of multi-objective optimization problems. In this type of problem the objectives are generally in conflict, i.e., improving the performance of one objective degrades the performance of others. Since the objectives conflict, there is no single solution but rather a set of solutions that represent the best compromises between the different objectives. Evolutionary Algorithms have become popular as robust and effective methods for solving optimization problems. In recent decades there has emerged a wide range of Evolutionary Algorithms for Multi-objective Optimization

(MOEAs), which are classified into two generations. The first generation comprises algorithms that do not include mechanisms for preserving good solutions (elitism): VEGA [13], MOGA [7], NPGA [9] and NSGA [5]. The second generation is characterized by the incorporation of elitism: SPEA [20], PAES [10], SPEA2 [19] and NSGA-II [4]. However, evolutionary algorithms are not the only techniques that have been used to solve multi-objective optimization problems. In recent years, several other metaheuristics have been adapted to such problems, including Ant Colony Optimization (MOACO) [8], Artificial Immune Systems [3] and Particle Swarm Optimization (MOPSO) [6]. This paper presents a comparative study of the NSGA-II [4] and OMOPSO [12] algorithms. For the comparison we use the ZDT test functions [18] and consider 5 performance metrics that measure the dispersion of the solutions and their proximity to the Pareto front: Error Ratio [16], Spacing [14], Generational Distance [15], Two Set Coverage [18] and Two Set Difference Hypervolume [17].

2. Basic Concepts
A multi-objective optimization problem is mathematically defined as follows [1]: find the vector x* = [x1*, x2*, ..., xn*]^T that satisfies the m inequality constraints and the p equality constraints

    g_i(x) ≥ 0, i = 1, 2, ..., m;    h_j(x) = 0, j = 1, 2, ..., p    (1)

and optimizes the vector function

    f(x) = [f_1(x), f_2(x), ..., f_k(x)]^T    (2)

where x = [x1, x2, ..., xn]^T is the vector of decision variables.

In multi-objective optimization algorithms, two solutions are compared with each other to determine whether one dominates the other. Dominance is defined as follows: a solution x1 is said to dominate another solution x2 if both of the following conditions hold:

1. The solution x1 is no worse than x2 in all objectives.
2. The solution x1 is strictly better than x2 in at least one objective.

If either of these conditions is violated, x1 does not dominate x2. If x1 dominates x2, we write x1 ≻ x2.

The formal definition of Pareto optimality is as follows [2]: let I = {1, ..., k} and let Ω be the set of feasible solutions. A vector of decision variables x* ∈ Ω is Pareto-optimal if there is no x ∈ Ω such that

    ∀i ∈ I : f_i(x) ≤ f_i(x*)  and  ∃j ∈ I : f_j(x) < f_j(x*)    (3)

The Pareto optimal set P* is defined by:

    P* = {x ∈ Ω | x is Pareto-optimal}    (4)

The Pareto front PF* is defined by:

    PF* = {f(x) ∈ R^k | x ∈ P*}    (5)

3. NSGA-II
NSGA-II was proposed by Deb, Agrawal, Pratap and Meyarivan in 2000 [4]. This algorithm uses elitism and crowded tournament selection. It introduces a fast non-dominated sorting approach with O(mN^2) computational complexity, where m is the number of objectives and N is the population size. Crowded tournament selection is a selection mechanism based on tournament selection in which a group of individuals takes part in a tournament and the winner is judged by the fitness level (a combination of rank and crowding distance) that each individual brings to the tournament. The crowding distance serves as an estimator of the perimeter of the cuboid formed by taking the nearest neighbors as vertices. The steps of NSGA-II are the following:

1. A parent population Pt is randomly generated and an offspring population Qt is created from it.
2. The populations Pt and Qt are combined into a population of size 2N, where N is the population size. This new population is called Rt.
3. The population Rt undergoes non-dominated sorting, in which all members are classified into fronts.
4. The best N individuals from Rt are selected using the crowded tournament selection operator and form the parent population of the next generation, Pt+1.
5. Steps 1-4 are repeated until the termination criterion is satisfied.
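As an illustration, the dominance test of Section 2 and the front classification of step 3 can be sketched as follows. This is a minimal sketch with our own function names, not the authors' implementation; the pairwise comparison loop is what gives the O(mN^2) cost.

```python
def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b` (minimization):
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(objs):
    """Classify objective vectors into successive non-dominated fronts,
    as in step 3 of NSGA-II."""
    n = len(objs)
    dominated_by = [set() for _ in range(n)]  # indices each solution dominates
    dom_count = [0] * n                       # how many solutions dominate i
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(objs[i], objs[j]):
                dominated_by[i].add(j)
            elif dominates(objs[j], objs[i]):
                dom_count[i] += 1
    # Peel off fronts: members with no remaining dominators form the next front.
    fronts = [[i for i in range(n) if dom_count[i] == 0]]
    while fronts[-1]:
        nxt = []
        for i in fronts[-1]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
    return fronts[:-1]
```

For example, `non_dominated_sort([(1, 2), (2, 1), (2, 2), (3, 3)])` places the two mutually non-dominated points in the first front and the dominated points in later fronts.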

4. OMOPSO
M. Reyes and C. Coello [12] propose an approach based on Pareto dominance and the use of a crowding factor for the selection of leaders. In each generation, each particle selects a leader. The selection is made by binary tournament based on the crowding values of the leaders. This proposal uses two external archives: one for storing the leaders currently used to perform the flight, and another for storing the final solutions. The crowding factor is used to filter the list of leaders whenever the maximum size imposed on that list is exceeded; only the leaders with the best crowding values are retained. Additionally, the authors subdivide the population into three different subsets and apply a different mutation operator to each subset. Algorithm 1 shows the pseudocode.

Algorithm 1 OMOPSO
1:  Initialize swarm
2:  Locate leaders
3:  Send leaders to ε-archive
4:  crowding(leaders)
5:  g = 0
6:  while (g < gmax) do
7:      for (i = 1 to number of particles) do
8:          Select leader, flight, mutation
9:          Evaluation, update pbest
10:     end for
11:     Update leaders
12:     Send leaders to ε-archive
13:     crowding(leaders)
14:     g = g + 1
15: end while
16: Report results in the ε-archive
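The leader-selection step (line 8 of Algorithm 1) can be sketched as below. This is a hedged sketch with our own function names, not the authors' code: `crowding_distance` follows the usual cuboid estimator (boundary points get infinite crowding), and `select_leader` is the binary tournament on crowding values described above.

```python
import random

def crowding_distance(front):
    """Crowding value per objective vector: boundary points get infinity,
    interior points accumulate the normalized cuboid side lengths."""
    n = len(front)
    dist = [0.0] * n
    for m in range(len(front[0])):            # one pass per objective
        order = sorted(range(n), key=lambda i: front[i][m])
        dist[order[0]] = dist[order[-1]] = float("inf")
        span = front[order[-1]][m] - front[order[0]][m] or 1.0
        for prev, cur, nxt in zip(order, order[1:], order[2:]):
            dist[cur] += (front[nxt][m] - front[prev][m]) / span
    return dist

def select_leader(leaders, crowd, rng=random):
    """Binary tournament: of two randomly drawn leaders, keep the less
    crowded one (the larger crowding value)."""
    i, j = rng.randrange(len(leaders)), rng.randrange(len(leaders))
    return leaders[i] if crowd[i] >= crowd[j] else leaders[j]
```

Preferring leaders with large crowding values biases the flight toward sparsely populated regions of the front, which is what produces the good spread OMOPSO shows in the experiments below.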

5. Experimental Design
In this section we detail the parameter settings we used, as well as the methodology followed in the experiments.

5.1. Test functions
The benchmark MOPs chosen to evaluate the algorithms are the ZDT problems [18]. For reasons of space, we show results for only 3 of them: ZDT1, ZDT2 and ZDT4. These problems have two objectives, both to be minimized:

    f_1(x) = x_1
    f_2(x) = g(x) h(f_1(x), g(x))    (6)

5.1.1. ZDT1. This is a 30-variable (n = 30) problem having a convex Pareto-optimal set, with x_i ∈ [0, 1]:

    g(x) = 1 + 9 (Σ_{i=2}^{n} x_i) / (n − 1)
    h(f_1(x), g(x)) = 1 − sqrt(f_1(x) / g(x))    (7)

5.1.2. ZDT2. This is an n = 30 variable problem having a nonconvex Pareto-optimal set, with x_i ∈ [0, 1]:

    g(x) = 1 + 9 (Σ_{i=2}^{n} x_i) / (n − 1)
    h(f_1(x), g(x)) = 1 − (f_1(x) / g(x))^2    (8)

5.1.3. ZDT4. This is an n = 10 variable problem having a convex Pareto-optimal set, with x_1 ∈ [0, 1] and x_2, ..., x_n ∈ [−5, 5]:

    g(x) = 1 + 10(n − 1) + Σ_{i=2}^{n} (x_i^2 − 10 cos(4π x_i))
    h(f_1(x), g(x)) = 1 − sqrt(f_1(x) / g(x))    (9)

5.2. Metrics of performance
For our comparative study, we implemented three unary and two binary performance measures.

5.2.1. Error Ratio (ER). This measure was proposed in [16] and indicates the percentage of solutions that are not members of the true Pareto optimal set:

    ER = (Σ_{i=1}^{n} e_i) / n    (10)

where n is the number of vectors in the current set of nondominated vectors; e_i = 0 if vector i is a member of the Pareto optimal set, and e_i = 1 otherwise. ER = 0 indicates ideal behavior.

5.2.2. Spacing (SP). This measure was proposed in [14] as a way of measuring the variance of the distances between neighboring vectors in the known Pareto front. It is defined as:

    SP = sqrt( (1 / (n − 1)) Σ_{i=1}^{n} (d̄ − d_i)^2 ),
    d_i = min_{j ≠ i} Σ_{k=1}^{K} |f_i^k − f_j^k|    (11)

where n is the number of vectors in the Pareto front found by the algorithm being evaluated, K is the number of objectives and d̄ is the mean of all d_i. A value of SP = 0 indicates that all the nondominated solutions found are equidistantly spaced.

5.2.3. Generational Distance (GD). This measure was proposed by Van Veldhuizen and Lamont [15]. It indicates how far the elements of the Pareto front produced by the algorithm are from those of the true Pareto front of the problem:

    GD = sqrt( Σ_{i=1}^{n} d_i^2 ) / n    (12)

where n is the number of nondominated vectors found by the algorithm being analyzed and d_i is the Euclidean distance between each of them and the nearest member of the true Pareto front. A value of GD = 0 indicates that all the elements generated lie on the true Pareto front of the problem.

5.2.4. Two Set Coverage (SC). This measure was proposed in [18] and gives the relative coverage comparison of two sets. Consider X' and X'' two sets of vectors. SC is defined as:

    SC(X', X'') = |{a'' ∈ X'' ; ∃ a' ∈ X' : a' ≻ a''}| / |X''|    (13)

If all points in X' dominate or are equal to all points in X'', then by definition SC(X', X'') = 1. When SC(X', X'') = 1 and SC(X'', X') = 0, we say that X' is better than X''.

5.2.5. Two Set Difference Hypervolume (HV). This measure was proposed in [17]. Consider X' and X'' two sets of vectors. HV is defined as:

    HV(X', X'') = δ(X' + X'') − δ(X'')    (14)

where X' + X'' denotes the set of nondominated vectors obtained from the union of X' and X'', and δ(X) is the hypervolume of the portion of the objective space that is dominated by X. HV(X', X'') = 0 together with HV(X'', X') < 0 indicates that X'' is better than X'.
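The test function of Eqs. (6)-(7) and the measures above translate almost directly into code. The sketch below uses our own illustrative function names and the strict dominance test of Section 2; it is not the implementation used in the experiments.

```python
import math

def zdt1(x):
    """ZDT1, Eqs. (6)-(7): two minimization objectives over len(x) variables."""
    f1 = x[0]
    g = 1.0 + 9.0 * sum(x[1:]) / (len(x) - 1)
    return (f1, g * (1.0 - math.sqrt(f1 / g)))

def spacing(front):
    """SP, Eq. (11): deviation of the minimum Manhattan distances d_i."""
    n = len(front)
    d = [min(sum(abs(a - b) for a, b in zip(front[i], front[j]))
             for j in range(n) if j != i) for i in range(n)]
    dbar = sum(d) / n
    return math.sqrt(sum((dbar - di) ** 2 for di in d) / (n - 1))

def generational_distance(front, true_front):
    """GD, Eq. (12): sqrt of summed squared nearest distances, divided by n."""
    sq = [min(sum((a - b) ** 2 for a, b in zip(p, q)) for q in true_front)
          for p in front]
    return math.sqrt(sum(sq)) / len(front)

def dominates(a, b):
    """Pareto dominance for minimization, as defined in Section 2."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def set_coverage(x1, x2):
    """SC(X', X''), Eq. (13): fraction of X'' dominated by some member of X'."""
    return sum(any(dominates(a, b) for a in x1) for b in x2) / len(x2)
```

For instance, on the Pareto-optimal ZDT1 point x = (0.25, 0, ..., 0) we get `zdt1(...) == (0.25, 0.5)`, and a perfectly equidistant front gives `spacing(...) == 0`.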

5.3. Parameterization
To compare and evaluate the quality of the results, we performed 30 runs of each algorithm on each test function. In addition to the quantitative comparison, we show for each problem the Pareto front corresponding to the median run with respect to the GD measure. These Pareto fronts were also used to apply the binary performance measures.

For both algorithms, real coding was adopted, with a population of 100 individuals. The number of evaluations of each objective function was limited to 25,000. The parameters for NSGA-II were: crossover probability (pc) equal to 0.8; mutation probability (pm) equal to 1/codesize, where codesize is the number of decision variables of the problem; crossover distribution index (ηc) equal to 15; and mutation distribution index (ηm) equal to 20. The parameters for OMOPSO were: mutation probability (pm) equal to 1/codesize; W = random(0.1, 0.5); C1, C2 = random(1.5, 2.0); and r1, r2 = random(0.0, 1.0). With reference to the binary measures [11], we say that an algorithm A is relatively better than an algorithm B when SC(A, B) > SC(B, A), and almost better than B when SC(B, A) = 0 and SC(A, B) > 0.9. For the values of HV, we say that an algorithm A is better than an algorithm B when HV(A, B) < 0 and HV(B, A) = 0.
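The decision rules above can be captured in a small helper. The verdict strings and function names below are ours, only for illustration; the thresholds are those stated in the text and in Section 5.2.4.

```python
def coverage_verdict(sc_ab, sc_ba):
    """Apply the SC decision rules: full one-sided coverage means 'better',
    SC(B, A) = 0 with SC(A, B) > 0.9 means 'almost better', and otherwise
    the larger coverage wins only 'relatively'."""
    if sc_ab == 1 and sc_ba == 0:
        return "A better than B"
    if sc_ba == 1 and sc_ab == 0:
        return "B better than A"
    if sc_ba == 0 and sc_ab > 0.9:
        return "A almost better than B"
    if sc_ab > sc_ba:
        return "A relatively better than B"
    if sc_ba > sc_ab:
        return "B relatively better than A"
    return "no conclusion"

def hypervolume_verdict(hv_ab, hv_ba):
    """Apply the HV decision rule: A better than B when HV(A, B) < 0
    and HV(B, A) = 0."""
    if hv_ab < 0 and hv_ba == 0:
        return "A better than B"
    if hv_ba < 0 and hv_ab == 0:
        return "B better than A"
    return "no conclusion"
```

These are exactly the rules applied to the tables in Section 6: for example, the ZDT4 values SC(OMOPSO, NSGA-II) = 1.0 and SC(NSGA-II, OMOPSO) = 0.0 yield the verdict that OMOPSO is better.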

Figure 1. Pareto fronts obtained by the algorithms for test function ZDT1.

6. Results
In this section, we discuss the results obtained by the algorithms. In the figures shown, the Pareto front on the left corresponds to OMOPSO and the one on the right to NSGA-II.

6.1. ZDT1

Table 1. Results obtained on function ZDT1 for NSGA-II and OMOPSO

                    NSGA-II      OMOPSO
ER  best            0.25         0.0
    worst           0.89         0.36
    median          0.6          0.05
    average         0.55413      0.09433
    Std. Dev.       0.16449      0.08766
GD  best            0.000455     0.000412
    worst           0.000731     0.000691
    median          0.000624     0.000603
    average         0.000602     0.000594
    Std. Dev.       0.0000752    0.0000526
SP  best            0.00595      0.00664
    worst           0.01440      0.00986
    median          0.01034      0.00849
    average         0.01014      0.00846
    Std. Dev.       0.00154      0.000958

SC(OMOPSO, NSGA-II) = 0.13       SC(NSGA-II, OMOPSO) = 0.01
HV(OMOPSO, NSGA-II) = 0.00146    HV(NSGA-II, OMOPSO) = 0.00279

From the results shown in Table 1, we can conclude that OMOPSO obtained the best results with respect to ER, GD and SP for test function ZDT1. OMOPSO returns the greater number of solutions belonging to the true Pareto front and the minimum GD; it also achieves a better spread of solutions than NSGA-II. Regarding the binary metrics, we can conclude that OMOPSO is relatively better than NSGA-II, since SC(OMOPSO, NSGA-II) > SC(NSGA-II, OMOPSO). We cannot conclude that OMOPSO is better than NSGA-II based on the values of HV.

6.2. ZDT2

Table 2. Results obtained on function ZDT2 for NSGA-II and OMOPSO

                    NSGA-II      OMOPSO
ER  best            0.12         0.0
    worst           0.71         0.23
    median          0.44         0.03
    average         0.4          0.05133
    Std. Dev.       0.13916      0.06323
GD  best            0.000444     0.000403
    worst           0.0005503    0.000512
    median          0.000498     0.000481
    average         0.000497     0.000473
    Std. Dev.       0.0000268    0.0000261
SP  best            0.00693      0.00704
    worst           0.01030      0.00947
    median          0.00858      0.00783
    average         0.00858      0.00794
    Std. Dev.       0.00087096   0.000506

SC(OMOPSO, NSGA-II) = 0.16       SC(NSGA-II, OMOPSO) = 0.01
HV(OMOPSO, NSGA-II) = 0.00217    HV(NSGA-II, OMOPSO) = 0.00282

Based on the results shown in Table 2, we can conclude that OMOPSO obtained the best results with respect to ER, GD and SP: OMOPSO has the lowest error ratio, is closest to the true Pareto front, and achieves the best spread of solutions. Regarding the binary metrics, we can conclude that OMOPSO is relatively better than NSGA-II on function ZDT2, since SC(OMOPSO, NSGA-II) is greater than the value obtained in the opposite case.

Figure 2. Pareto fronts obtained by the algorithms for test function ZDT2.

6.3. ZDT4

Table 3. Results obtained on function ZDT4 for NSGA-II and OMOPSO

                    NSGA-II      OMOPSO
ER  best            1.0          1.0
    worst           1.0          1.0
    median          1.0          1.0
    average         1.0          1.0
    Std. Dev.       1.0          1.0
GD  best            0.03072      0.02993
    worst           0.11512      0.41516
    median          0.06454      0.09786
    average         0.06629      0.11863
    Std. Dev.       0.02499      0.08811
SP  best            0.00858      0.00645
    worst           0.03048      0.49180
    median          0.01434      0.04142
    average         0.014995     0.08465
    Std. Dev.       0.00565      0.11923

SC(OMOPSO, NSGA-II) = 1.0        SC(NSGA-II, OMOPSO) = 0.0
HV(OMOPSO, NSGA-II) = -0.09818   HV(NSGA-II, OMOPSO) = 0.0

As shown in Figure 3, neither of the two algorithms was able to converge to the Pareto-optimal front of function ZDT4; for that reason all the values of ER are 1.0. Regarding the GD metric, we can conclude the following: the best GD is lower for OMOPSO than for NSGA-II, but considering the average and the standard deviation, NSGA-II obtains the better performance on this function under this metric. As shown in Figure 3, the Pareto front of OMOPSO has a portion of the objective space where no solutions are generated. The value of SP indicates that NSGA-II has the better spread of solutions for function ZDT4. Regarding the binary metrics, we can conclude that OMOPSO is better than NSGA-II, since SC(OMOPSO, NSGA-II) = 1 and SC(NSGA-II, OMOPSO) = 0, which means that OMOPSO dominates all solutions generated by NSGA-II. We reach the same conclusion from the HV values, since HV(OMOPSO, NSGA-II) < 0 and HV(NSGA-II, OMOPSO) = 0.

Figure 3. Pareto fronts obtained by the algorithms for test function ZDT4.

7. Conclusions
In this article we have performed an experimental comparison between two algorithms for multi-objective optimization. To evaluate the performance of the algorithms, we used three well-known benchmark problems (ZDT) together with three unary and two binary metrics. According to the experiments with the test functions examined and the parameter settings used, we can conclude that OMOPSO has the best performance. The differences in the unary metric values between the two algorithms are not significant; the binary metric values indicate that OMOPSO is relatively better than NSGA-II on two test functions and better on one. As part of our future work, we plan to carry out a comparative study using constrained test functions.

References
[1] C. Coello. Evolutionary Algorithms for Solving Multi-Objective Problems. 2007.
[2] C. Coello, G. Lamont, and D. Van Veldhuizen. Comprehensive Survey of Evolutionary-Based Multiobjective Optimization Techniques. 1999.
[3] N. Cruz-Cortés and C. A. Coello. Multiobjective optimization using the clonal selection principle of the artificial immune system. 9th Electrical Engineering Conference, pages 470-477, 2003.
[4] K. Deb, S. Agrawal, A. Pratap, and T. Meyarivan. A fast elitist non-dominated sorting genetic algorithm for multi-objective optimization: NSGA-II. Proceedings of the Parallel Problem Solving from Nature VI Conference, pages 849-858, 2000.
[5] K. Deb and N. Srinivas. Multiobjective optimization using nondominated sorting in genetic algorithms. Evolutionary Computation, pages 221-248, 1994.
[6] J. Durillo, J. García, A. Nebro, C. Coello, F. Luna, and E. Alba. Multi-objective particle swarm optimizers: An experimental comparison. 5th International Conference on Evolutionary Multi-Criterion Optimization (EMO 2009), 2009.
[7] C. Fonseca and P. Fleming. Genetic algorithms for multiobjective optimization: Formulation, discussion and generalization. Proceedings of the Fifth International Conference on Genetic Algorithms, pages 416-423, 1993.
[8] C. García, O. Cordón, and F. Herrera. An empirical analysis of multiple objective ant colony optimization algorithms for the bi-criteria TSP. ANTS Workshop, pages 61-72, 2004.
[9] J. Horn, N. Nafpliotis, and D. Goldberg. A niched Pareto genetic algorithm for multiobjective optimization. Proceedings of the First IEEE Conference on Evolutionary Computation, pages 82-87, 1994.
[10] J. Knowles and D. Corne. The Pareto archived evolution strategy: A new baseline algorithm for multiobjective optimization. Congress on Evolutionary Computation, pages 98-105, 1999.
[11] M. Reyes. Use of Coevolution and Fitness Inheritance for Multi-Objective Particle Swarm Optimization. PhD thesis, Center of Research and Advanced Studies of the National Polytechnic Institute, Mexico City, 2006.
[12] M. Reyes and C. Coello. Improving PSO-based multi-objective optimization using crowding, mutation and ε-dominance. In Evolutionary Multi-Criterion Optimization (EMO 2005), pages 505-519, 2005.
[13] J. Schaffer. Multiple Objective Optimization with Vector Evaluated Genetic Algorithms. PhD thesis, Vanderbilt University, Nashville, Tennessee, 1984.
[14] J. Schott. Fault Tolerant Design Using Single and Multicriteria Genetic Algorithm Optimization. Master's thesis, Department of Aeronautics and Astronautics, Massachusetts Institute of Technology, Cambridge, Massachusetts, 1995.
[15] D. Van Veldhuizen and G. Lamont. On measuring multiobjective evolutionary algorithm performance. 2000 Congress on Evolutionary Computation, pages 204-211, 2000.
[16] D. Van Veldhuizen. Multiobjective Evolutionary Algorithms: Classifications, Analyses and New Innovations. PhD thesis, Department of Electrical and Computer Engineering, Air Force Institute of Technology, Ohio, 1999.
[17] E. Zitzler. Evolutionary Algorithms for Multiobjective Optimization: Methods and Applications. PhD thesis, Swiss Federal Institute of Technology, Zurich, Switzerland, 1999.
[18] E. Zitzler, K. Deb, and L. Thiele. Comparison of multiobjective evolutionary algorithms: Empirical results. Evolutionary Computation, pages 173-195, 2000.
[19] E. Zitzler, M. Laumanns, and L. Thiele. SPEA2: Improving the Strength Pareto Evolutionary Algorithm. Technical Report 103, Swiss Federal Institute of Technology, Zurich, Switzerland, 2001.
[20] E. Zitzler and L. Thiele. Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach. IEEE Transactions on Evolutionary Computation, pages 257-271, 1999.
