Wednesday, June 22, 2005

The Blind Man and The Genetic Algorithms + Neural Networks

Yes, again: the guy teaching the Performance Analysis course said that GAs and Neural Networks are not good things to use and that he prefers his Markovian Models. He even told us about another guy who wrote a PhD thesis trying to argue that the whole Universe, from DNA to Black Holes, is a Markovian phenomenon. Oh, I should remind you that the guy who said those things about GAs and NNs is supervised by the same professor who dismissed GAs, Neural Networks, Fuzzy Logic and the other Soft Computing approaches with a homophobic slur.

But the most comical moment was when that guy turned to my friend, who also studies GAs and NNs with me, and said: "Once I was reading the last chapter of a Neural Networks book, and that last chapter is exactly the first one of my PhD thesis. So you, boy, who study those things, would lose a job to me if we had to compete for it."

Gosh, it's a little difficult for me to believe that this kind of thinking still exists: almost 50 years have passed since the pioneering EC works of the 1950s, and more than 40 years since Ingo Rechenberg and Hans-Paul Schwefel met at TUB. The same goes for John Holland and Lawrence Fogel. Once I read an interview with David Goldberg on EvoWeb in which he said that when he began to study GAs, people saw him as someone joining some obscure religion, and that his GA application to a gas pipeline was received with laughs in the review line of the Journal of Hydraulic Engineering. But that was almost 25 years ago. A similar thing happened to Hans-Paul Schwefel and Ingo Rechenberg when they got their degrees at TUB: they were instructed to go back to studying Fluid Dynamics or they would have to leave TUB. So both decided to leave. Those facts happened a long time ago, and I have some difficulty understanding how, even today, there are people who act against GAs, NNs and the other Soft Computing approaches in the same way people acted against the pioneering GA work.

But, anyway...

See You. :D


Saturday, June 11, 2005

Biomimetic Concept Car

What a nice car!!! Take a look:

This is a bio-inspired car design, but the car is not only beautiful: it also has some aerodynamic optimizations.

Well, I agree with Ingo Rechenberg, Der Bioniker: we have a lot to learn from Nature.



Monday, June 06, 2005

The Michalewicz Test Function


Continuing with the Test Function Suite, today I have a very interesting function: the Michalewicz Test Function.

The function is written this way:

F(x1,x2) = -(sin(x1)*(sin(2*x1^2/pi))^(2*10))-(sin(x2)*(sin(2*x2^2/pi))^(2*10))
x1 = [0:pi]
x2 = [0:pi]

I don't know where the global minimum is located, but I think it is where the graphs point: (x1,x2) = (pi/2, pi/2), which gives F(x1,x2) = -2.

I again used an ES with the same configuration as the ones in the posts below.

Population = 60
Offspring = 60 (I used the (mu+mu)-ES)
Generation = 5000
Simulation Time = 12.82799 s
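This is not my actual code, just a rough Python sketch of the (mu+mu) idea: Gaussian mutation plus truncation ("plus") selection. The fixed mutation step sigma, the generation count and the random seed are arbitrary choices for illustration (a real ES would usually self-adapt sigma):

```python
import math
import random

def michalewicz(x1, x2):
    # The function exactly as written above (m = 10)
    return (-(math.sin(x1) * math.sin(2 * x1 ** 2 / math.pi) ** 20)
            - (math.sin(x2) * math.sin(2 * x2 ** 2 / math.pi) ** 20))

def es_mu_plus_mu(f, lo, hi, mu=60, generations=300, sigma=0.3, seed=1):
    """Bare-bones (mu+mu)-ES: Gaussian mutation with a fixed step size,
    then truncation selection over the union of parents and offspring."""
    rng = random.Random(seed)
    pop = [(rng.uniform(lo, hi), rng.uniform(lo, hi)) for _ in range(mu)]
    for _ in range(generations):
        offspring = [tuple(min(max(x + rng.gauss(0.0, sigma), lo), hi)
                           for x in p) for p in pop]
        # "plus" selection: keep the mu best of parents + offspring
        pop = sorted(pop + offspring, key=lambda p: f(*p))[:mu]
    return pop[0], f(*pop[0])

best, fbest = es_mu_plus_mu(michalewicz, 0.0, math.pi)
```

Because the selection is elitist, the best fitness can only improve from one generation to the next, which is why the search graphs below are monotone.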

Below we have some graphs.

The function:


Another view:


Here you have a map of the function:


The search graph:


I began with these best values:
F(x1,x2) = -1.9467647907315375
x1 = 1.553340240456647
x2 = 1.6026577871222381

After 5000 generations I got:

F(x1,x2) = -1.9999999999954916
x1 = 1.5707950628346909
x2 = 1.5707936022511222

I have to try the search with a GA.

See You!!


Sunday, June 05, 2005

The De Jong Test Suite F2

Hello!! :D

Here is another function from the De Jong Test Suite.

The function is this:

F(x1,x2) = 100*(x1^2 - x2)^2 + (1 - x1)^2

The function is very simple and I got good values. The problem is to minimize the function. The global minimum is F(x1,x2) = 0 at (x1,x2) = (1,1).
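To double-check the numbers, here is the function in Python (De Jong's F2 is the well-known Rosenbrock function); the second evaluation uses the final point my ES found:

```python
def dejong_f2(x1, x2):
    # De Jong F2, i.e. the Rosenbrock "banana" function
    return 100.0 * (x1 ** 2 - x2) ** 2 + (1.0 - x1) ** 2

print(dejong_f2(1.0, 1.0))                               # 0.0 (global minimum)
print(dejong_f2(1.000118812318453, 1.0002330930642427))  # ~1.62e-08
```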

ES Configuration:

Population = 60
Offspring = 60 (I used the (mu+mu)-ES)
Generation = 5000
Simulation Time = 9.493 s

Below we have the function's graph:


Another view:


Now you can see the ES performance. The horizontal axis is the number of generations and the vertical axis is the best fitness.


A closer view of the graph above:


The best values at the first generation were:

F(x1,x2) = 1.2143242479659295
x1 = -0.02275919812830051
x2 = 0.041540885751586681

The final values were:

F(x1,x2) = 1.6182695892245374e-008
x1 = 1.000118812318453
x2 = 1.0002330930642427

A good approximation. :D

See You!!


P.S.: In the next post I will show the Colville Function. That one has four variables. :D

The Easom Function


Here I am again, to show another result that I got with my ES.

The function that I used this time was the Easom Function:

F(x1,x2) = -cos(x1)*cos(x2)*exp(-(x1-pi)^2 - (x2-pi)^2)
x1 = [-100:100]
x2 = [-100:100]

ES Configuration:

Population = 60
Offspring = 60 (I used the (mu+mu)-ES)
Generation = 5000
Simulation Time = 6.99 s
The problem is to minimize the function above. The global minimum is -1 with x1 = pi and x2 = pi.
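In Python the function looks like this. Note that away from (pi, pi) the exponential term makes the landscape essentially flat, which is what makes Easom hard for a population that starts far away:

```python
import math

def easom(x1, x2):
    # A narrow "hole" of depth -1 at (pi, pi) in an almost perfectly flat plane
    return (-math.cos(x1) * math.cos(x2)
            * math.exp(-(x1 - math.pi) ** 2 - (x2 - math.pi) ** 2))

print(easom(math.pi, math.pi))  # -1.0 (global minimum)
print(easom(50.0, 50.0))        # ~0.0: the plateau covers almost the whole domain
```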

Below I show you my results.

Best initial values:
F(x1,x2) = -2.0079019279809812e-027
x1 = 4.405669730345565
x2 = 10.71696945361343

Best final values:
F(x1,x2) = -0.99999978685693391
x1 = 3.141922312441527
x2 = 3.1414098408067366

Here we have the Easom Function's graph:


A side view:


Here you see the evolution of the ES. The horizontal axis is the number of generations and the vertical axis is the best fitness value.


Well, I think that my ES found very good solutions to the Easom Function. :D

See You!!


Saturday, June 04, 2005

The De Jong Test Suite F7 - The Schaffer Function

Hello!! :D

Here I am again, to show you a little of the Evolutionary Computation world. I was looking through old simulations I had made and I found a very interesting thing: the Schaffer Function F7. This function was used by De Jong, one of the Genetic Algorithms pioneers, to test the behaviour and performance of his GAs.

The problem that I faced was one of minimization. I used an Evolution Strategy with this configuration:

Population = 60
Offspring = 60 (I used the (mu+mu)-ES)
Generation = 5000
Simulation Time = 12.5 seconds

The Schaffer Function F7 is this:

F(x,y) = (x^2 + y^2)^0.25 * (sin^2(50*(x^2+y^2)^0.1) + 1)

x = [-100:100]
y = [-100:100]
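In Python the function looks like this; the parenthesization below, with the "+ 1" outside the squared sine, is the one that reproduces the values I report:

```python
import math

def schaffer_f7(x, y):
    # Schaffer F7: highly multimodal, with concentric rings around the origin
    s = x * x + y * y
    return s ** 0.25 * (math.sin(50.0 * s ** 0.1) ** 2 + 1.0)

print(schaffer_f7(0.0, 0.0))  # 0.0 (global minimum)
```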

Below are two graphs of the function.

Here we have a side view of the function


The function viewed from the top:


The best values that I found were:

F(x,y) = 0.055245410615999468
x = 0.0022735653858623633
y = 0.0020354408287815693

The global minimum is F(x,y) = 0 with x = 0 and y = 0. I think that I got a "good" solution. :D

See You!!



Thursday, June 02, 2005

The First Genetic Argonaut Blog's Big Challenge!!!

Yes, my friends, I got it!! I'm in the middle of an Operations Research (OR) course at the University and, so far, I feel so bored by it. I know those methods are very useful, but for the problems I have in mind, I can tell you that the classical methods are not the most suitable for the job.

So, to feel more comfortable with the OR course, I challenged my OR professor with an optimization problem. The function is this:

f(x1,x2) = 21.5 + x1*sin(4*pi*x1) + x2*sin(20*pi*x2)

This function is the same as in the graphics below, and the intervals

x1 = [-3:12.1]
x2 = [4.1:5.8]

are the same as in the graphics too.
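To make the challenge concrete, here is the function in Python (I believe it is the example from Michalewicz's GA book, usually stated as a maximization problem), plus a crude random-sampling baseline for anyone who wants a number to beat. The 100,000 samples and the seed are arbitrary choices of mine, not part of the bet:

```python
import math
import random

def f(x1, x2):
    # The challenge function over x1 in [-3, 12.1], x2 in [4.1, 5.8]
    return (21.5 + x1 * math.sin(4.0 * math.pi * x1)
            + x2 * math.sin(20.0 * math.pi * x2))

# Crude baseline: uniform random sampling over the stated intervals
rng = random.Random(0)
best = max(f(rng.uniform(-3.0, 12.1), rng.uniform(4.1, 5.8))
           for _ in range(100_000))
print(best)
```

Both sine terms oscillate fast (the x2 term has period 0.1), so the landscape is full of narrow peaks; that is exactly why I expect gradient-based methods to struggle here.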

I told the professor that I will use an Evolution Strategy and a Genetic Algorithm to find a good solution for x1 and x2. She told me that she will use Fibonacci Search combined with the Gradient Method. Let's wait until next week to see the results. :D Same bat-time, same bat-channel. :D

Well, my friends, it will be a very hard and bloody fight until the death!!! :P

So, make your bets!!!

See You!!