
Multi-Objective Evolutionary Algorithms
Theory and Application to Problems in Code Scheduling
and Hardware Synthesis
Abstract
Evolutionary algorithms possess several characteristics
that are desirable for problems involving (i) multiple conflicting objectives
and (ii) intractably large and highly complex search spaces. This is typical
of many optimization problems in the field of computer engineering. Consider
the design of a computer system. An optimal design might be an architecture
that minimizes cost while also minimizing the overall power consumption. However,
these goals generally conflict: low-power architectures substantially
increase cost, while cheap architectures usually need a lot of power, and none
of these solutions can be called superior without further consideration.
As a consequence, there is no single optimum, but rather a set of alternative
optima, generally known as Pareto-optimal solutions. Evolutionary algorithms
are able to capture multiple Pareto-optimal solutions in a single simulation
run. Achieving good convergence and diversity of solutions at low computational
cost are the central issues in multi-objective optimization using evolutionary
algorithms.
Examples
of our current research are given below.
One problem with MOEAs is that an evolutionary algorithm
may lose the best solutions over the generations by performing recombination
among the solutions. A remedy to this problem are the so-called elitist MOEAs,
which store the non-dominated solutions of each generation in an archive.
The drawback of archives, however, is that the archive needs to be updated
in each generation. The number of comparisons grows especially
when the number of individuals or the size of the archive increases.
Here we investigate different kinds of data structures, such as Quad-trees
and linear lists, for realizing the archives.
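As a minimal sketch of the linear-list variant of such an archive, the update step can be written as follows (function names are illustrative, not from our implementation; minimization is assumed):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert candidate into a linear-list archive of non-dominated solutions.

    The candidate is rejected if any archive member dominates it;
    otherwise, archive members dominated by the candidate are removed
    and the candidate is appended.
    """
    if any(dominates(a, candidate) for a in archive):
        return archive
    return [a for a in archive if not dominates(candidate, a)] + [candidate]
```

Every update scans the whole list, which is exactly why the number of comparisons grows with the archive size and motivates tree-based alternatives such as Quad-trees.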

MOPSO 
Unlike evolutionary computation techniques, the Particle Swarm Optimization (PSO)
method is inspired by simulations of the social behavior of bird flocking and
fish schooling. PSO was originally designed and developed by Eberhart and
Kennedy; nevertheless, it shares many similarities with evolutionary computation
techniques. The system is initialized with a population of random solutions
and searches for optima by updating generations. Unlike EAs, PSO has no evolution
operators such as crossover and mutation. Instead, the potential solutions
fly through the problem space by following the current optimum.
In PSO, each single solution is a "bird" in the search space, called a "particle".
All particles have fitness values, which are evaluated by the fitness function
to be optimized, and velocities, which direct the flight of the particles.
The particles fly through the problem space by following the current optimum
particle, called the guide.
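The standard single-particle update described above can be sketched as follows (a textbook-style formulation with inertia weight w and acceleration coefficients c1, c2; the parameter values are illustrative defaults, not tuned settings):

```python
import random

def pso_step(position, velocity, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    """One PSO velocity/position update for a single particle.

    w is the inertia weight; c1 and c2 weight the pull toward the
    particle's own best position (pbest) and the swarm guide (gbest).
    All arguments are lists of equal length.
    """
    new_v = [w * v
             + c1 * random.random() * (pb - x)
             + c2 * random.random() * (gb - x)
             for v, x, pb, gb in zip(velocity, position, pbest, gbest)]
    new_x = [x + v for x, v in zip(position, new_v)]
    return new_x, new_v
```

Note that the only place a "guide" enters is the gbest term, which is why redefining the guide is the key step in extending PSO to multiple objectives.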

Adapting PSO to optimize a multi-objective problem
requires a redefinition of what a guide is in order to obtain a front of
optimal solutions. In Multi-Objective Particle Swarm Optimization (MOPSO),
the Pareto-optimal solutions should be used to determine the guide for each
particle. But selecting the guide (the best local guide) from the set of
approximated Pareto-optimal solutions for each particle of the population
is a difficult yet important problem for attaining convergence and
diversity of solutions. Here, we propose the Sigma method for finding the
local best guides.
 Download the program of the Sigma MOPSO
 Download an example of moving particles
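For the two-objective case, the sigma value of an objective vector (f1, f2) can be written as (f1² − f2²)/(f1² + f2²), and each particle takes as its guide the archive member whose sigma value is closest to its own. A minimal sketch under these assumptions (function names are illustrative; the zero-vector corner case is ignored):

```python
def sigma(f):
    """Sigma value of a two-objective vector f = (f1, f2).
    Points on the same line through the origin share the same sigma value."""
    f1sq, f2sq = f[0] ** 2, f[1] ** 2
    return (f1sq - f2sq) / (f1sq + f2sq)

def sigma_guide(particle_obj, archive):
    """Pick the archive member whose sigma value is closest to that of
    the particle's objective vector."""
    s = sigma(particle_obj)
    return min(archive, key=lambda a: abs(sigma(a) - s))
```

Because the comparison is angular rather than distance-based, particles in different regions of objective space are steered toward different parts of the front, which supports diversity.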
Other research topics on MOEA and MOPSO:
 Covering approximated Pareto-optimal fronts
 Diversity metric: the Sigma diversity metric
 Hybrid MOEA: the Hybrid MOEA (HMOEA)
is a combination of an MOEA with the Subdivision method to obtain controllable exploration
of the search space. Here, we cover the approximated Pareto-optimal fronts
of some multi-objective problems.
Hierarchical Chromosomes in System Synthesis 
We propose an approach for solving hierarchical multi-objective optimization
problems (MOPs). In realistic MOPs, two main challenges have to be considered:
(i) the complexity of the search space and (ii) the non-monotonicity of the
objective space. Here, we introduce a hierarchical problem description (chromosomes)
to deal with the complexity of the search space. Since evolutionary algorithms
have been proven to provide good solutions in non-monotonic objective spaces,
we apply genetic operators also to the structure of the hierarchical chromosomes.
This novel approach decreases exploration time substantially.
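To illustrate the idea of operating on structure as well as on genes, consider a toy hierarchical chromosome encoded as nested lists, where leaves are bits and inner nodes group sub-chromosomes. The sketch below is purely illustrative, not our actual encoding or operators: structural mutation reorders children, while leaf mutation flips bits.

```python
import random

def mutate(node, rate=0.1):
    """Mutate a hierarchical chromosome given as nested lists of 0/1 leaves.

    Inner nodes: with probability `rate`, reorder the children
    (a structural mutation on the hierarchy), then recurse.
    Leaves: with probability `rate`, flip the bit.
    The input chromosome is not modified.
    """
    if isinstance(node, list):
        children = node[:]
        if random.random() < rate:
            random.shuffle(children)  # structural mutation
        return [mutate(child, rate) for child in children]
    return 1 - node if random.random() < rate else node  # leaf bit flip
```

The point of the encoding is that one structural move can rearrange a whole subtree at once, which is what lets the search traverse a complex space faster than leaf-level mutation alone.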



Visual Programming of Evolutionary Algorithms
While an evolutionary algorithm
is a powerful optimization concept, one of its drawbacks is the difficulty
of implementing it. Users require some programming expertise to write
a computer program that implements their algorithm according to their needs.
This has to be done before they can carry out their design task, the work they
should really be engaged in. A simple solution to this problem has been developed
in our research group. The figure on the right shows a graphical user interface
in which a user can construct a particular evolutionary algorithm graphically,
including entering the fitness function and chromosome structure, choosing operators
for selection and recombination, and drawing the flowchart of the complete
evolutionary algorithm. Even hybrid combinations of different evolutionary
algorithm methods, such as Genetic Algorithms operating on bit strings and Genetic
Programming using tree-like chromosomes, can be achieved. From the graphical
input, Java code is emitted, and the resulting evolutionary algorithm is
compiled and can be run directly. The program has a comprehensive user interface
as well as powerful graphical displays for ease of use and visualization of
simulation results.

Publications