
Hi, I'm an absolute beginner and I have a rather theoretical question about iterative optimization in genetic algorithms.

Where in the genetic cycle (see below) does the iterative optimization logically belong? I'm not really sure, but I think it could go in population initialization or mutation, depending on the given problem.

To be more specific about the iterative optimization algorithm: I would like to use "hill climbing" or "simulated annealing".

I use this model as reference:

[Image: diagram of the genetic algorithm cycle]

manlio
asked Apr 5, 2015 at 16:32

1 Answer


Well, there are several possibilities, and basically all of them make sense.

If you put the optimisation phase after population initialization (run once, before the genetic algorithm itself), what you get is an already optimized initial population. This can be useful because the genetic algorithm then has less searching to do, but it can also be harmful: every individual gets pulled toward some local optimum, possibly losing useful genetic information in the process.
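As a rough sketch of that first option, assuming a bit-string encoding and a one-max-style fitness function (both made up here purely for illustration), pre-optimizing the initial population with hill climbing might look like:

```python
import random

# Hypothetical one-max setup: individuals are bit lists, fitness counts ones.
def fitness(ind):
    return sum(ind)

def hill_climb(ind, steps=50):
    """First-improvement hill climbing: flip one random bit per step,
    keep the neighbor only if it is strictly better."""
    best = list(ind)
    for _ in range(steps):
        neighbor = list(best)
        i = random.randrange(len(neighbor))
        neighbor[i] ^= 1  # flip one bit
        if fitness(neighbor) > fitness(best):
            best = neighbor
    return best

# Optimize the initial population once, before the GA loop starts.
population = [[random.randint(0, 1) for _ in range(20)] for _ in range(10)]
population = [hill_climb(ind) for ind in population]
```

Note how the local search never accepts a worse neighbor, so each individual's fitness can only stay equal or improve, which is exactly why the whole population can end up concentrated around local optima.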

If you put the optimisation phase before selection, you get a so-called memetic algorithm (MA). MAs are based on the idea that an organism learns throughout its lifetime (the optimization). There are two ways to do that:

  1. Take an individual, optimize it, and replace the original individual with its optimized version. This is called "Lamarckian evolution" and is based on the idea (originally due to Jean-Baptiste Lamarck at the beginning of the 19th century) that learned features can be passed on to the offspring.

  2. Take an individual, optimize it, but then throw the optimized individual away and assign its fitness to the original, unoptimized individual. In this variant the optimization effectively becomes part of the fitness evaluation. This is called the "Baldwin effect" and is based on the idea (originally due to James Mark Baldwin at the end of the 19th century) that learned features cannot be passed on to the offspring; the genetic information instead describes the ability to learn. By the way, this is how natural evolution actually works.
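The two variants can be sketched as follows; `local_search` and `fitness` are hypothetical stand-ins for whatever optimizer (e.g. hill climbing) and fitness function your problem actually uses:

```python
def evaluate_lamarckian(ind, local_search, fitness):
    # Lamarckian: the optimized individual replaces the original,
    # so the learned traits are passed on to the offspring.
    improved = local_search(ind)
    return improved, fitness(improved)

def evaluate_baldwinian(ind, local_search, fitness):
    # Baldwinian: the optimized individual is discarded; only its
    # fitness is credited to the original genotype, so learning
    # guides selection but is not inherited.
    improved = local_search(ind)
    return ind, fitness(improved)
```

As a rule of thumb, the Lamarckian variant tends to converge faster but loses genotypic diversity sooner, while the Baldwinian variant preserves the genotypes at the cost of extra fitness evaluations.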

The optimization can, of course, be placed anywhere else in the cycle, but to my knowledge that is not done in practice. Still, your problem and your mutation/crossover operators might be such that optimization in one of those other places turns out to be beneficial. Or you can just try it and see what you get.

answered Apr 5, 2015 at 22:10
