### The DeJong Test Suite F7 - The Schaffer Function

Hello!! :D

Here I am again to show you a little of the Evolutionary Computation world. I was looking through old simulations I had made and I found something very interesting: the Schaffer Function F7. This function was used by DeJong, one of the Genetic Algorithms pioneers, to test the behaviour and performance of his GAs.

The problem I faced was one of minimization. I used an Evolution Strategy (ES) with this configuration:

Population = 60

Offspring = 60 (I used the (mu+mu)-ES)

Generations = 5000

Simulation Time = 12.5 seconds

The Schaffer Function F7 is this:

F(x,y) = (x^2 + y^2)^0.25 * (sin^2(50*(x^2 + y^2)^0.1) + 1)

x ∈ [-100, 100]

y ∈ [-100, 100]
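The post doesn't include code, but the function and the (mu+mu)-ES setup above can be sketched in Python. I use the standard form of F7 (with the +1 outside the squared sine); the step size `sigma`, the seed, and the lack of step-size self-adaptation are illustrative assumptions, not the author's actual settings:

```python
import math
import random


def schaffer_f7(x, y):
    # Schaffer F7; global minimum F(0, 0) = 0
    s = x * x + y * y
    return s ** 0.25 * (math.sin(50.0 * s ** 0.1) ** 2 + 1.0)


def plus_es(mu=60, generations=5000, sigma=1.0, seed=42):
    """Minimal (mu+mu)-ES sketch: each parent creates one offspring by
    Gaussian mutation; the best mu of parents + offspring survive."""
    rng = random.Random(seed)
    pop = [(rng.uniform(-100.0, 100.0), rng.uniform(-100.0, 100.0))
           for _ in range(mu)]
    for _ in range(generations):
        offspring = []
        for x, y in pop:
            # mutate and clip back into the search interval
            cx = min(max(x + rng.gauss(0.0, sigma), -100.0), 100.0)
            cy = min(max(y + rng.gauss(0.0, sigma), -100.0), 100.0)
            offspring.append((cx, cy))
        # plus-selection: survivors come from parents AND offspring
        pop = sorted(pop + offspring, key=lambda p: schaffer_f7(*p))[:mu]
    return pop[0]
```

Because plus-selection is elitist, the best fitness never gets worse from one generation to the next.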

Below are two graphs of the function.

[Figure: side view of the function]

[Figure: the function seen from above]

The best values that I found were:

F(x,y) = 0.055245410615999468

x = 0.0022735653858623633

y = 0.0020354408287815693

The global minimum is F(x,y) = 0 at x = 0 and y = 0. I think I got a "good" solution. :D

See You!!

[]´s

Nnosophorus


## 3 Comments:

Hi!

A few questions ... what kind of ES did you use?! (a,b) or (a+b)? Didn't you turn the mutation off to see if you could find the optimum?

And another unrelated point ... do you apply evolutionary methods on some real-world application?

Hello!!

I'm so glad that you left a comment on my blog, Genetic Argonaut. :D

Now, answering your questions:

1 - "What kind of ES did you use?! (a,b) or (a+b)?"

My ES is configured this way:

- Parents Population = 60

- Offspring Population = 60

- Total of Generations = 5000

- Kind of ES = (a+b)-ES, because each parent generates one offspring and selection takes place in the population formed by the parents + offspring. :D
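The difference between the two selection schemes mentioned here fits in a few lines of Python (a generic sketch, not the author's code):

```python
def plus_selection(parents, offspring, mu, fitness):
    # (mu+lambda)-ES: survivors are drawn from parents AND offspring
    # (elitist: the best individual can never be lost)
    return sorted(parents + offspring, key=fitness)[:mu]


def comma_selection(parents, offspring, mu, fitness):
    # (mu,lambda)-ES: survivors are drawn from the offspring ONLY
    # (non-elitist: parents die out every generation)
    return sorted(offspring, key=fitness)[:mu]
```

With mu parents each producing one child, `plus_selection` gives exactly the (mu+mu)-ES described in the post.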

2 - "Didn't you turn the mutation off to see if you could find the optimum?"

- I implemented the Gaussian Mutation with some modifications. Those modifications allow the ES to work better inside the variables' intervals. With them, the ES may (or may not) escape from a local optimum. But I need to run more simulations to verify the mutation operator's behaviour.

3 - "And another unrelated point ... do you apply evolutionary methods on some real-world application?"

- So far I have not, because my advisor is finishing his PhD and before this I studied Neural Networks. I started with EC around one year ago, but when my advisor comes back next August, he and I will use EC to evolve Neural Networks for weather forecasting. I'm an undergraduate student interested in Artificial Intelligence. :D

I hope I have answered your questions. I'm very glad that you asked them. :D

But now, let me ask you some questions: :D

What problems have you tackled using EC? :D

Did you try the DeJong Test Suite with your EC algorithms? :D

See You!! :D

Hi Marcelo!

Thanks for your answers. I found what I wanted! (: I hope you succeed in evolving that NN for weather forecasting. I think it is a nice and useful subject to work on. I suggest you watch out for the overfitting that may be caused by finding the very optimal parameters of the NN.

About my work on EC: well, I have done some experiments ranging from simple one- or two-dimensional optimization (something like the DeJong or ... functions, or even simpler) to applications like evolving multi-layer dielectric absorbers, evolving hidden Markov models (HMMs), some preliminary research on analog circuit design, and evolving robot controllers. The latter is the most recent one, in which I used both evolution and learning to design a behavior-based controller.
