Friedrich-Alexander-Universität
Department Informatik, Lehrstuhl für Informatik 12
Multi-Objective Evolutionary Algorithms

Theory and Application to Problems in Code Scheduling and Hardware Synthesis


Evolutionary algorithms possess several characteristics that are desirable for problems involving (i) multiple conflicting objectives and (ii) intractably large and highly complex search spaces. This is typical of many optimization problems in the field of computer engineering. Consider the design of a computer system: an optimal design might be an architecture that minimizes cost while also minimizing overall power consumption. These goals, however, generally conflict: low-power architectures substantially increase cost, while cheap architectures usually consume a lot of power. Neither solution can be said to be superior without further consideration. As a consequence, there is no single optimum, but rather a set of alternative optima, generally known as Pareto-optimal solutions. Evolutionary algorithms are able to capture multiple Pareto-optimal solutions in a single simulation run. Achieving good convergence and diversity of solutions at low computational cost are the key issues in multi-objective optimization with evolutionary algorithms.
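The notion of Pareto optimality can be made concrete with a small sketch. Assuming minimization of both objectives, the hypothetical (cost, power) vectors below stand in for candidate architectures; the function names are illustrative, not from any particular library:

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Return the non-dominated subset of a list of objective vectors."""
    return [s for s in solutions
            if not any(dominates(other, s) for other in solutions if other != s)]

# Hypothetical (cost, power) trade-offs for four architectures:
designs = [(1.0, 9.0), (3.0, 4.0), (5.0, 2.0), (4.0, 5.0)]
front = pareto_front(designs)  # (4.0, 5.0) is dominated by (3.0, 4.0)
```

The three surviving points are mutually incomparable: each is better than the others in one objective and worse in the other, which is exactly why no single optimum exists.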

Examples of our current research are given below.

One problem in MOEAs is that an evolutionary algorithm may lose the best solutions over the generations by performing recombination among the solutions. A remedy to this problem is provided by the so-called elitist MOEAs, which store the non-dominated solutions of each generation in an archive. The difficulty with archives, however, is that the archive must be updated in every generation. The number of dominance comparisons grows especially when the number of individuals or the size of the archive increases. Here we investigate different data structures, such as Quad-trees and linear lists, for realizing the archives.
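The linear-list variant of such an archive can be sketched as follows. Every insertion scans the whole list, which makes the per-generation cost grow with the archive size; the names and objective vectors are illustrative assumptions, and minimization is assumed:

```python
def dominates(a, b):
    """True if a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert candidate into an elitist archive kept as a linear list.
    Each insertion compares the candidate against every stored member,
    so the cost grows linearly with the archive size -- the motivation
    for faster structures such as Quad-trees."""
    if any(dominates(a, candidate) or a == candidate for a in archive):
        return archive  # candidate dominated or duplicate: archive unchanged
    # remove members dominated by the candidate, then append it
    return [a for a in archive if not dominates(candidate, a)] + [candidate]

archive = []
for point in [(3.0, 4.0), (4.0, 5.0), (2.0, 3.0)]:  # hypothetical objective vectors
    archive = update_archive(archive, point)
```

Here (4.0, 5.0) is rejected on arrival and (3.0, 4.0) is later evicted, leaving only (2.0, 3.0): the archive always holds exactly the non-dominated solutions seen so far.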

Swarm MOPSO - In contrast to evolutionary computation techniques, Particle Swarm Optimization (PSO) is motivated by the simulation of the social behavior of bird flocking and fish schooling. PSO was originally designed and developed by Eberhart and Kennedy. Nevertheless, it shares many similarities with evolutionary computation techniques: the system is initialized with a population of random solutions and searches for optima by updating generations. Unlike an EA, however, PSO has no evolution operators such as crossover and mutation. In PSO, each single solution is a "bird" in the search space, called a particle. All particles have fitness values, evaluated by the fitness function to be optimized, and velocities, which direct their flight. The particles fly through the problem space by following the current optimum particle, called the guide.

Adapting PSO to optimize a multi-objective problem requires a redefinition of what a guide is in order to obtain a front of optimal solutions. In Multi-Objective Particle Swarm Optimization (MOPSO), the Pareto-optimal solutions should be used to determine the guide for each particle. Selecting the guide (the best local guide) from the set of approximated Pareto-optimal solutions for each particle of the population is, however, a difficult yet important problem for attaining convergence and diversity of solutions. Here, we propose the Sigma method for finding the best local guides.
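The idea can be sketched for the two-objective case: the sigma value identifies the line through the origin on which a point lies, and each particle takes as its guide the archive member with the closest sigma value. For more than two objectives, sigma generalizes to a vector; the function names and archive contents below are illustrative, and nonzero objective vectors are assumed:

```python
def sigma(f1, f2):
    """Sigma value of a point in two-objective space; points lying on the
    same line through the origin share the same sigma value.
    Assumes (f1, f2) is not the zero vector."""
    return (f1 ** 2 - f2 ** 2) / (f1 ** 2 + f2 ** 2)

def best_local_guide(particle, archive):
    """Choose, from the archive of non-dominated solutions, the member
    whose sigma value is closest to that of the particle."""
    s = sigma(*particle)
    return min(archive, key=lambda a: abs(sigma(*a) - s))

# hypothetical archive of non-dominated objective vectors
archive = [(1.0, 3.0), (2.0, 2.0), (3.0, 1.0)]
guide = best_local_guide((2.5, 1.0), archive)  # closest in sigma: (3.0, 1.0)
```

Because guides are matched by direction rather than distance, particles spread along the whole front instead of clustering around a single global best, which is what supports diversity of the resulting solutions.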

- Download the Sigma MOPSO program
- Download an example of moving particles

Other research on MOEA and MOPSO:

  • Covering approximated Pareto-optimal fronts
  • Diversity metric: the Sigma diversity metric
  • Hybrid MOEA: The Hybrid MOEA (HMOEA) combines an MOEA with the subdivision method to obtain controllable exploration of the search space. Here, we cover the approximated Pareto-optimal fronts of several multi-objective problems.

Hierarchical Chromosomes in System Synthesis - We propose an approach for solving hierarchical multi-objective optimization problems (MOPs). In realistic MOPs, two main challenges have to be considered: (i) the complexity of the search space and (ii) the non-monotonicity of the objective space. Here, we introduce a hierarchical problem description (hierarchical chromosomes) to deal with the complexity of the search space. Since evolutionary algorithms have been shown to provide good solutions in non-monotonic objective spaces, we also apply genetic operators to the structure of the hierarchical chromosomes. This novel approach decreases exploration time substantially.

[Figure: Hierarchical Chromosomes]
[Figure: Easy EA Programming]
Visual Programming of Evolutionary Algorithms - While an evolutionary algorithm is a powerful optimization concept, one of its drawbacks is the difficulty of implementing it. Users require some programming expertise to write a computer program that implements the algorithm according to their needs, and this has to be done before they can carry out the design task they should really be engaged in. A simple solution to this problem has been developed in our research group. The figure shows a graphical user interface in which a user can construct a particular evolutionary algorithm graphically, including entering the fitness function and chromosome structure, choosing operators for selection and recombination, and drawing the flowchart of the complete evolutionary algorithm. Even hybrid combinations of different evolutionary algorithm methods, such as Genetic Algorithms operating on bit strings and Genetic Programming using tree-like chromosomes, can be achieved. From the graphical input, Java code is emitted, and the resulting evolutionary algorithm is compiled and can be run directly. The program has a comprehensive user interface as well as powerful graphical displays for ease of use and visualization of simulation results.

