Monday, April 20, 2009

Worse Is Better!

David Tow has written a thought-provoking article on the role artificial intelligence (AI) methods are playing, and will play, in e-business.

He outlines a set of major methods he believes will become a hotbed for e-commerce and web intelligence:

  1. Evolutionary algorithms
  2. Bayesian networks
  3. Fuzzy logic
  4. Swarm intelligence
  5. Neural networks
  6. Intelligent agents

Except for #4 and #6, I have seen many enterprise applications of the remaining methods -- to be honest, not only enterprise ones, but a highly diversified spectrum of uses of methods #1, #2, #3, and #5.

But what David Tow argues for is the increasing use of those methods in e-commerce and web intelligence. I think his arguments are well reasoned: since those endeavours are recent phenomena -- they arose only after the Internet and the World Wide Web -- it is normal that AI methods have not been widely applied to them so far.

His prognostics about future trends are even more thought-provoking:

"The enterprise of the future will increasingly depend on the wide range of rigorous artificial intelligence algorithms and methods outlined above, to drive its operations at all levels of management.

Decision Engineering techniques are at the forefront of this revolution -- while IBM has recently set up a new services unit focussed on applying predictive modelling to automate business decisions.

These techniques will continue to be enhanced and packaged in different combinations, to provide immensely powerful problem solving capability as well as integrating with the global intelligence of the Web 4.0.

Major decisions incorporating sophisticated levels of intelligent problem-solving will increasingly be applied autonomously and within real time constraints, in order to achieve the level of adaptability required to survive in an ever-changing and uncertain global environment."

I do not think traditional methods are sentenced to die out. Just take a look at the operations research folks, who have built a shining research city on a hill since the early days of that field around the 1930s; I do not believe they will suddenly vanish from the face of the planet.

The author did not point out an important issue: "better" does not always mean better. The definition of "better" here is not intended to slur other methods; it captures what an enthusiast of a research area thinks about his own field: not so rarely, we see researchers speaking colourfully about their own research. I consider that an understandable attitude, but hyped-up words do not solve problems in the real world, and the real world, as an old saying goes, is a cruel place that may tear any hype to shreds. The chills of the AI Winter can still be felt.

Another important point has to do with the user base a given method owns. Someone may have designed the algorithm with the best time complexity, the best neural network, the best fuzzy set tuning method, and so on, but if those methods do not have a significant user base, they are useless -- they may serve as an interesting academic investigation, but not for application in the real world, since no one would be using them. The ultimate example of this is the so-called worse-is-better approach:

[...] [S]omething can be "inferior" but still "better". For example, to a particular market or user, software that is limited but exceptionally simple to use may be "better" than software that is more comprehensive but harder to use.

I hope I am not slurring anyone else's work out there, but the elitist Simple Genetic Algorithm (SGA) is, in my humble and insignificant opinion, a perfect example of worse is better in evolutionary computation.

No, I do not consider the elitist SGA a bad method. On the contrary: it is an amazing method devised by Professor John Holland, and it has succeeded in so many fields -- thanks to its simplicity and the domain knowledge its users have embedded in it. BUT, despite its drawbacks, the elitist SGA is the most applied genetic algorithm (and evolutionary algorithm) to date, even though there are newer genetic-based techniques, direct descendants of the SGA, such as EDAs (Estimation of Distribution Algorithms), that were designed to overcome some of the SGA's inherent drawbacks. Fifteen years after the publication of Baluja's seminal EDA paper, the SGA is still strong, alive and well, and bigger than all of its offspring. Worse is better! :)
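Part of the elitist SGA's appeal is how little machinery it needs. Here is a minimal sketch of one (my own illustrative toy, not anyone's reference implementation): bitstring individuals, tournament selection, one-point crossover, bit-flip mutation, and elitism that keeps the best-so-far individual alive. The OneMax fitness function and all parameter values are arbitrary choices for the demo.

```python
import random

def elitist_sga(fitness, n_bits=20, pop_size=30, generations=60,
                cx_prob=0.9, mut_prob=0.01, seed=0):
    """Toy elitist simple GA on bitstrings (illustrative sketch only)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        def select():
            # Binary tournament selection
            a, b = rng.choice(pop), rng.choice(pop)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = select()[:], select()[:]
            if rng.random() < cx_prob:
                # One-point crossover: swap tails at a random cut point
                cut = rng.randrange(1, n_bits)
                p1[cut:], p2[cut:] = p2[cut:], p1[cut:]
            for child in (p1, p2):
                for i in range(n_bits):
                    if rng.random() < mut_prob:
                        child[i] ^= 1  # bit-flip mutation
                children.append(child)
        pop = children[:pop_size]
        # Elitism: the best individual found so far always survives
        gen_best = max(pop, key=fitness)
        if fitness(gen_best) > fitness(best):
            best = gen_best
        else:
            pop[rng.randrange(pop_size)] = best[:]
    return best

# OneMax: maximise the number of 1-bits in the string
solution = elitist_sga(lambda bits: sum(bits))
```

That is the whole algorithm -- which is exactly the point: it is easy to code, easy to explain, and easy to bolt domain knowledge onto.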

Even the SGA's siblings (evolution strategies, evolutionary programming, and genetic programming) have not achieved the wide reach the SGA has. The SGA's dominance, and its offspring's struggle to win a bigger share of the application market, is, in my opinion, an example of worse is better. Could the SGA be an evolutionarily stable strategy? :)
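For contrast with the SGA, the EDAs mentioned above replace the population-plus-crossover machinery with an explicit probabilistic model. A minimal sketch in the spirit of Baluja's PBIL (again my own toy, with arbitrary parameters, not his reference code): keep a probability vector over bit positions, sample a small population from it, and nudge the vector towards each generation's winner.

```python
import random

def pbil(fitness, n_bits=20, pop_size=30, generations=80,
         learn_rate=0.1, seed=0):
    """Toy PBIL-style EDA: evolve a probability vector, not a population."""
    rng = random.Random(seed)
    probs = [0.5] * n_bits  # start with an unbiased model
    best = None
    for _ in range(generations):
        # Sample a population from the current probability vector
        sample = [[1 if rng.random() < p else 0 for p in probs]
                  for _ in range(pop_size)]
        winner = max(sample, key=fitness)
        if best is None or fitness(winner) > fitness(best):
            best = winner
        # Shift the model towards the generation winner
        probs = [(1 - learn_rate) * p + learn_rate * b
                 for p, b in zip(probs, winner)]
    return best

# Same OneMax toy problem as before
best = pbil(lambda bits: sum(bits))
```

Arguably a cleaner model of what a GA is doing statistically -- and yet, in practice, it is the cruder SGA that owns the user base.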

So, I do not believe future problem solving will be strongly tied to artificial intelligence methods -- at least not in the fatalistic manner some believe. Those methods will play a role, major or minor, in different problem-solving situations, but not to the point of our relying 100% on them.


Thursday, April 16, 2009

Sounding Stars Through Simulated Evolution

NASA's Kepler Mission, intended to find Earth-like planets around other stars, will use a parallel genetic algorithm to help with that task. Read and watch the complete story here.

