Saturday, March 29, 2008

Evolving Poetry Through Simulated Evolution

From Wired and posted at Bruce Sterling's blog, see here.

It is a genetic algorithm that evolves poetry. Every 200 generations, a new population of poems is created. There is an interesting video outlining how the genetic algorithm works in general.

It reminds me of the famous Mark V. Shaney created by our blog friend Don P. Mitchell and put on USENET as a cyber-prank by Rob Pike and Bruce Ellis. See an instance here. I was told by Don Mitchell that his first experiments concerning Mark V. Shaney were performed using the Tao Te Ching text.

An excerpt showing Mark V. Shaney's surrealistic composing ability:

"While at a conference a few weeks back, I spent an interesting evening with a grain of salt."
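
Mark V. Shaney worked by chaining word n-grams harvested from its input text. A minimal sketch of that Markov-chain idea (certainly not Mitchell's actual code, just the gist) could look like this:

```python
import random
from collections import defaultdict

def build_chain(text, order=2):
    """Map each word pair (prefix) to the list of words seen after it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        prefix = tuple(words[i:i + order])
        chain[prefix].append(words[i + order])
    return chain

def babble(chain, length=20, seed=0):
    """Walk the chain from a random two-word prefix, emitting words."""
    rng = random.Random(seed)
    prefix = rng.choice(list(chain))
    out = list(prefix)
    for _ in range(length):
        followers = chain.get(tuple(out[-2:]))
        if not followers:
            break  # dead end: this prefix never appeared mid-text
        out.append(rng.choice(followers))
    return " ".join(out)
```

Feed it enough text (the Tao Te Ching, say, or USENET posts) and locally plausible but globally surreal sentences fall out, exactly the "grain of salt" effect above.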

I wonder about the time when those programs will really compose human-level literature. What about a Nobel Prize in Literature for the next generation of evolutionary algorithms?


Thursday, March 27, 2008

Professor Schwefel's Keynote At EvoStar

Our blog friend Juan Julián Merelo Guervós is posting about the EvoStar 2008 event.

His first post is a short summary concerning the opening keynote given by Professor Hans-Paul Schwefel, see it here.

His more recent post summarizes the keynote presented by Professor Schwefel in a bit more depth, see here.

Very interesting posts!

On the future of evolutionary computation, Professor Schwefel said that "the unexpected should be expected".

Professor Schwefel also mentioned a situation in which a paper reviewer commented: "Why should other optimization algorithms be necessary?" This comment says a lot about traditional optimization methods and the development of evolution strategies, since some researchers - during the 1960s and 1970s - thought that gradient-based methods and other techniques (such as the Gauss-Seidel method) were all they needed.

He also spoke of the application of evolution strategies (in the form of so-called experimental optimization) to a two-phase flashing nozzle optimization, which was performed without computers! Even today, it is not possible to calculate what happens within such a nozzle: thermodynamics far from equilibrium, drag between slow water droplets and fast steam, a three-dimensional turbulent boundary layer with a liquid sublayer, supersonic flow behind the nozzle throat, etc.

There were several challenges to overcome when laying the groundwork for the evolution strategy, such as self-adaptation, a non-elitist selection method, parallelism, and so on. He drew inspiration from natural systems to implement solutions to those problems.
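
The simplest place where that self-adaptation idea survives in textbook form is the (1+1)-ES with Rechenberg's 1/5 success rule: the mutation step size itself adapts during the run. A minimal sketch (the classic textbook rule, not the nozzle experiment itself) might look like:

```python
import random

def one_plus_one_es(f, x, sigma=1.0, generations=200, seed=1):
    """Minimize f with a (1+1)-ES: mutate with Gaussian noise of
    step size sigma, keep the child if it improves, and adapt
    sigma with the classic 1/5 success rule."""
    rng = random.Random(seed)
    fx = f(x)
    successes = 0
    for g in range(1, generations + 1):
        child = [xi + rng.gauss(0.0, sigma) for xi in x]
        fc = f(child)
        if fc < fx:
            x, fx = child, fc
            successes += 1
        if g % 10 == 0:  # every 10 generations, check the success rate
            # more than 1/5 of mutations succeeding: widen the step;
            # fewer: shrink it to search more locally
            sigma *= 1.5 if successes / 10 > 0.2 else 0.82
            successes = 0
    return x, sigma
```

The constants (1.5, 0.82, the window of 10) are conventional choices, and the whole thing fits the "expect the unexpected" spirit: the algorithm tunes its own search behaviour as it goes.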

An important comment that evolutionary computation researchers should pay attention to:

"He's commented that evolutionary algorithms are getting less bio-inspired in time, this is not good or bad, but it's more interesting for him to look at models than to have super-tweaked ultra-tuned purportedly bio-inspired algorithms."


Tuesday, March 25, 2008

Evolving Crystal Structure Through Simulated Evolution

From my news webservice I got this interesting sample of a genetic algorithm application, see here.

An excerpt:

"[Julio] Facelli, director of the University of Utah's Center for High Performance Computing and a biomedical informatics professor, uses NCSA's Mercury cluster to predict crystal structures for organic molecules that are frequently used in pharmaceuticals, fertilizers, and explosives."

"Modeling the crystal structure of a given substance, Facelli and his team begin with nothing more than the atoms in the molecule and the nature of their bonds. They're looking for structures with the lowest energies, which typically mark the molecules' standard crystal structures or something very close. The problem is that this straightforward data and straightforward goal create billions of possible solutions. Just imagine finding the needle of the lowest energy in that haystack of possible structures."

"'An exhaustive search is not feasible, so we have to have a way to direct the search,' says Facelli. '[Genetic algorithms] are based on the principle of survival of the fittest. Trial solutions compete with one another in the population for survival and produce offspring for the next generation of tests. These algorithms offer excellent scaling properties, which make them good for large-scale parallel computing systems like those at NCSA and emerging computational grids like TeraGrid.'"

Here the article outlines, in a simple manner, the inner workings of the genetic algorithm used:

"For example, an initial calculation on NCSA's Mercury may run 20 simulations on 20 different processors simultaneously, calculating possible crystal structures for a given set of atoms and their bonds. The energies for these structures are compared. The 10 with the lowest energies are kept, and the features of those 10 are mixed and matched to generate the structures for another 10 possible structures. Energies are calculated again, comparisons are made, best candidates are kept, and the cycle continues.

The 'mating operation,' as the mixing and matching is called, stagnates quickly, producing very similar structures over the course of thousands of generations. To combat this lack of variety, the genetic algorithm also introduces arbitrary mutations into the process, occasionally taking one variable from one of the best candidates and including a random number for that variable in the next generation."
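
The cycle the article describes (keep the 10 lowest-energy structures out of 20, mate them, mutate occasionally) can be sketched roughly as below. The `energy` function here is a toy stand-in of my own, not Facelli's physics, and the variable ranges are made up:

```python
import random

rng = random.Random(42)

def energy(structure):
    # Toy stand-in for the expensive energy calculation; the real
    # one involves the atoms, their positions, and bond physics.
    return sum(v * v for v in structure)

def ga_step(population):
    """Keep the 10 lowest-energy structures, 'mate' them to build
    10 more, and occasionally mutate one variable at random."""
    best = sorted(population, key=energy)[:10]
    offspring = []
    for _ in range(10):
        a, b = rng.sample(best, 2)
        child = [rng.choice(pair) for pair in zip(a, b)]  # mix and match
        if rng.random() < 0.1:  # arbitrary mutation to fight stagnation
            child[rng.randrange(len(child))] = rng.uniform(-5.0, 5.0)
        offspring.append(child)
    return best + offspring

population = [[rng.uniform(-5.0, 5.0) for _ in range(6)] for _ in range(20)]
initial_best = min(map(energy, population))
for _ in range(50):
    population = ga_step(population)
final_best = min(map(energy, population))
```

Note how, without the mutation line, every child is just a reshuffle of the same 10 survivors' variables: the stagnation the article warns about.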

Stagnation has been, since time immemorial, a hard drawback to overcome, especially when using an elitist selection method. Taking into account that the individuals holding the lowest energy levels are the best ones, it is, in some sense, expected that the genetic algorithm gets stuck quickly. Of course, there are already techniques to treat that, such as truncation selection.

If I were the author, and considering he is using mutation and crossover, I would set a high mutation rate (0.85 to 0.95) and a mild crossover rate (0.45 to 0.6); of course, I would set the elitism rate (the number of fittest individuals saved per generation) to about 1/7 of the population size. Inversion could help too!


Thursday, March 20, 2008


Very good point from the masterminds behind the Grey Thumb blog. It's a link to Dr. Deborah Gordon's lecture on how ants know what to do, see here.

Understanding the inner workings of ants' behaviour - and that of other swarms too - may be important to further the current state of the bio-inspired methods relying on it.

I read her book Ants at Work: How an Insect Society is Organized. She even applied neural networks to model the way that ants interact.

Maybe the main lesson we may learn from those little insects is directly linked to the fact that ants (and other kinds of swarms) do what they do without any central control, and the nest works fine. Sure, when it comes to the evolutionary computation realm, it is always sensible to remember that ant nests have very large populations, while in optimization (and machine learning in general) a population set up that big may be a drawback for some problems. Let alone that keeping a small population is very useful when dealing with expensive fitness/objective function evaluations.


Evolving Electrical Circuits Through Simulated Evolution

Another application of a genetic algorithm, this time to optimize circuit component values, see here.

The resistor's and the NTC thermistor's values must be set so that the resulting curve (from the circuit equation) matches the curve on the chart below:

At first glance, it seems a very simple problem, but, as the article observes, it may get hard to solve as the circuit complexity increases.

It reminds me of a tale about a senior student in an electrical circuits class. The professor assigned some hard circuit problems to be solved through variable substitutions and so on, with so many derivatives to work out. The senior student used a clever way to solve them:

01. He organized all the equations.

02. He applied the Laplace transform to all of them.

03. He put all the parameters on a matrix.

04. He solved all the equations through the Gauss-Jordan method.
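
Step 04 is just elimination carried all the way to reduced row-echelon form. A small self-contained sketch of the Gauss-Jordan method (generic textbook version, of course, not the student's actual worksheet):

```python
def gauss_jordan(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    # Build the augmented matrix [A | b].
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Pick the row with the largest pivot to keep things stable.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        p = M[col][col]
        M[col] = [v / p for v in M[col]]  # normalize the pivot row
        for r in range(n):
            if r != col and M[r][col] != 0:
                factor = M[r][col]
                M[r] = [v - factor * w for v, w in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]
```

Once the Laplace transform turns the differential equations into algebraic ones, the whole problem really is just one call to something like this. No wonder the professor was annoyed.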

I was not there, but some "persons" state that the professor refused his solution!!!

I wonder if that professor is the same guy who once said that 1/2 Ohm is not the same as 0.5 Ohm!!!!


Three Years!!!!

Hi, There!!!!

Today is the third anniversary of this humble and simple blog! :)

I acknowledge the old friends and the new ones too! :)

I hope that you have enjoyed some posts I have made along this time. I have learned a lot writing them.

By the way, it's nice to welcome the new friends, such as Juan Julián Merelo Guervós, Nojhan, Hrafn Thorri Thórisson, Michael Trick, and others!! :)

Thank You! :)


Saturday, March 01, 2008

Evolving Grid Computing Optimization Through Simulated Evolution (Updated)

Interesting stuff I got via my webnews service: Evolving towards the future of science: genetic algorithms and grid computing.

The main objective has to do with the optimization of job scheduling on a computing grid. Through this, lots of valuable resources are saved, such as energy consumption, CPU time, human resources, and so on.

It seems that the genetic algorithm (GA) applied is the good old CGA, or Canonical Genetic Algorithm (likely with some slight modifications, for instance elitism, stochastic tournament selection, etc.), which means bit-flip mutation, one-point crossover, and a fitness-based selection scheme. Of course, I may be wrong about that, since I have not read anything (except that news piece) about the genetic algorithm they use for that task.
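
For reference, the CGA recipe just mentioned - bit-flip mutation, one-point crossover, fitness-proportionate selection - can be sketched on the classic onemax toy problem (maximize the number of 1 bits). This is the textbook algorithm, not necessarily what the DIOGENES team implemented, and the rates are conventional defaults:

```python
import random

rng = random.Random(7)

def roulette(pop, fits):
    """Fitness-proportionate (roulette-wheel) selection."""
    r = rng.uniform(0, sum(fits))
    acc = 0.0
    for ind, f in zip(pop, fits):
        acc += f
        if acc >= r:
            return ind
    return pop[-1]

def canonical_ga(fitness, n_bits=20, pop_size=30, generations=60,
                 p_cross=0.7, p_mut=0.01):
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        fits = [fitness(ind) for ind in pop]
        new_pop = []
        while len(new_pop) < pop_size:
            a, b = roulette(pop, fits), roulette(pop, fits)
            if rng.random() < p_cross:       # one-point crossover
                point = rng.randrange(1, n_bits)
                a = a[:point] + b[point:]
            # bit-flip mutation: each bit flips with probability p_mut
            child = [bit ^ (rng.random() < p_mut) for bit in a]
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

best = canonical_ga(sum)  # onemax: fitness = count of 1 bits
```

Every "new wave" variant I come across is, in the end, this loop with one piece swapped out: a different selection scheme, a different operator, a different encoding.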

From the article:

"Standing for Distributed Optimal GENEtic algorithm for Grid applications Scheduling, DIOGENES quickly determines the most efficient way to schedule a set of jobs on a computing grid, optimizing both time and resources in the process."

I have noticed an interesting (strange?) phenomenon taking place in, let's say, the New Wave Of Genetic Algorithms (NWOGA): it seems as though no one is applying them! From my surveys around the Internet looking for nice real-world applications of evolutionary algorithms, what I have noticed is that when it comes to genetic algorithms, the most applied one is still the... Canonical Genetic Algorithm! Surely, as stated above, the CGA being applied to solve a particular problem very frequently holds modifications, be they in the selection scheme, the evolutionary operators (mutation, crossover, and, from time to time, inversion), the population size, or even in the probability tuning of mutation and crossover. This game of fiddling with the CGA has sometimes generated strange algorithms.

By the way, I have nothing against NWOGA.

Our blog friend Julian Togelius wrote an interesting observation about that situation. I consider that he hit the right target, see:

"I think the reason so few people are using other types of evolutionary algorithms than standard GAs is that few people know of, or understand, anything else. It's really amazing how little people know of what's going on next doors to their own little research field.


So, the people who are busy coming up with new algorithms don't have much time for learning about applications, and vice versa..."

Absolutely! :)

The big academic/technical organizations/publishers (ACM, IEEE, Springer, etc.) could sponsor more crossover events in which different research areas could come a little closer to share their respective experiences, new ideas, and so on, trying to find common ground so that one area could help the other. Perhaps that attitude could bring some fresh air to both fields.

P.S: To better understand why I wrote NWOGA, please, see here. :)

Update: Also via MEDAL Blogging.

