Wednesday, May 05, 2010

Rosenbrock Function And The Steepest Descent

It's interesting how much the steepest descent method struggles when it is not set up within the "right" parameter interval of the Rosenbrock Function:

It's a well-known testbed function for optimization, and evolutionary algorithms can find its optimum with no trouble at all -- at least as long as you use the traditional parameter settings that any introductory Evolutionary Computation book gives you.

Using steepest descent with the following setup:

x = 2

y = 2

Step Size = 0.001

Stop Criterion = 10⁻⁶ (on the gradient norm)

I got the following results (see the image below):

Pay attention to the path the steepest descent takes until it finds the optimum at x = 1 and y = 1. It is said that an evolution strategy would follow a similar path if its population were infinite. Since, so far, there are no real computers with infinite memory, such an assumption cannot be verified in the real world.
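For readers who want to reproduce the run, here is a minimal sketch of fixed-step steepest descent on the Rosenbrock function, using the setup above (start at (2, 2), step size 0.001, stop when the gradient norm drops below 10⁻⁶). The function and variable names are mine, not from any particular library:

```python
import math

def rosenbrock(x, y):
    # f(x, y) = (1 - x)^2 + 100 (y - x^2)^2, global minimum at (1, 1)
    return (1.0 - x) ** 2 + 100.0 * (y - x * x) ** 2

def rosenbrock_grad(x, y):
    # Analytical partial derivatives of the Rosenbrock function
    dfdx = -2.0 * (1.0 - x) - 400.0 * x * (y - x * x)
    dfdy = 200.0 * (y - x * x)
    return dfdx, dfdy

def steepest_descent(x, y, step=0.001, tol=1e-6, max_iter=1_000_000):
    # Fixed-step steepest descent: move against the gradient
    # until its norm falls below the stop criterion.
    path = [(x, y)]
    for _ in range(max_iter):
        gx, gy = rosenbrock_grad(x, y)
        if math.hypot(gx, gy) < tol:
            break
        x -= step * gx
        y -= step * gy
        path.append((x, y))
    return x, y, path

x_opt, y_opt, path = steepest_descent(2.0, 2.0)
print(f"optimum ~ ({x_opt:.6f}, {y_opt:.6f}) after {len(path) - 1} steps")
```

With these settings the iterate first bounces across the parabolic valley y = x², then creeps along its floor toward (1, 1), which takes tens of thousands of tiny steps -- exactly the slow, winding path visible in the plot.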

During a second run I set the starting point to x = 5 and y = 5. The optimization path ran very far away from the previous one -- and from the optimum too!
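The blow-up from (5, 5) is easy to see numerically: the very first gradient there is enormous, so the fixed 0.001 step hurls the iterate far outside the valley, after which the gradient only grows. A small sketch (same hand-rolled gradient as before, just the first few steps):

```python
def rosenbrock_grad(x, y):
    # Partial derivatives of f(x, y) = (1 - x)^2 + 100 (y - x^2)^2
    dfdx = -2.0 * (1.0 - x) - 400.0 * x * (y - x * x)
    dfdy = 200.0 * (y - x * x)
    return dfdx, dfdy

x, y = 5.0, 5.0
for i in range(5):
    gx, gy = rosenbrock_grad(x, y)
    x -= 0.001 * gx  # at (5, 5) the x-gradient is already ~40008
    y -= 0.001 * gy
    print(f"step {i + 1}: x = {x:.3e}, y = {y:.3e}")
```

The first step alone lands at roughly x = -35, and each following step multiplies the overshoot, so the path diverges instead of homing in on (1, 1). A smaller step size (or a line search) would be needed from this starting point.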
