Wednesday, June 28, 2006

Algorithmic Perspectives

One of my favorite blogs is iDesign@UCI. The authors are generally thoughtful, original, and analytical, which makes for a good mix. Below is a link to a particular post at that blog, followed by the post itself, the comment exchange, and my own remarks.


http://tinyurl.com/jrxwr

Friday, June 23, 2006
Compositional Evolution
I'm reading through a new book, Compositional Evolution by Richard Watson, and it is really interesting. Watson's thesis is that certain evolutionary methods of variation (sexual recombination, lateral gene transfer, symbiotic encapsulation, etc.) are fundamentally distinct in an algorithmic sense from the traditional gradualist framework (in which beneficial mutations are accumulated in a linear fashion). Gradualism operates as a hill-climbing search strategy, and is therefore prone to get stuck in local optima. Compositional mechanisms, on the other hand, are apparently able to escape them (in systems with a semi-modular relationship among their variables) by combining pre-adapted genetic data from two distinct lineages.
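To make the contrast concrete, here is a small Python sketch of my own (a toy construction, not code or a benchmark from Watson's book) that uses a concatenated "trap" fitness function as a stand-in for a semi-modular landscape. Two lineages that have each solved different modules are both stuck under single-bit hill climbing, yet block-wise recombination of the two readily yields a fitter genome:

```python
import random

BLOCK, BLOCKS = 4, 5               # module size and number of modules
N = BLOCK * BLOCKS

def trap(bits):
    # Deceptive module: all-ones scores best (4), but otherwise fewer
    # ones score higher, so point mutation is pulled toward all-zeros (3).
    k = sum(bits)
    return BLOCK if k == BLOCK else BLOCK - 1 - k

def fitness(g):
    return sum(trap(g[i:i + BLOCK]) for i in range(0, N, BLOCK))

def hill_climb(g, steps=5000):
    # Gradualist variation: keep a single-bit flip only if it improves fitness.
    g, best = g[:], fitness(g)
    for _ in range(steps):
        i = random.randrange(N)
        g[i] ^= 1
        if fitness(g) > best:
            best = fitness(g)
        else:
            g[i] ^= 1              # revert the neutral or harmful flip
    return g, best

def compose(a, b):
    # Compositional variation: the child inherits each whole module intact
    # from one parent or the other (block-wise recombination).
    return [bit for i in range(0, N, BLOCK)
            for bit in random.choice((a, b))[i:i + BLOCK]]

random.seed(0)
# Two lineages, each pre-adapted on different modules and each sitting
# at a local optimum of the trap landscape.
a = [1] * (3 * BLOCK) + [0] * (2 * BLOCK)   # first three modules solved
b = [0] * (3 * BLOCK) + [1] * (2 * BLOCK)   # last two modules solved

print("parent fitness:", fitness(a), fitness(b))                    # 18 and 17
print("after hill climbing:", hill_climb(a)[1], hill_climb(b)[1])   # unchanged
kids = [compose(a, b) for _ in range(100)]
print("best composed child:", max(map(fitness, kids)))              # typically 20
```

The point of the toy is only that the compositional operator can jump between distant local optima in a single step, something no sequence of individually beneficial point mutations can do on this landscape.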

Watson is a computer scientist, and examines evolution from an algorithmic perspective. His frank admission, and proof, that gradualism is unable to evolve systems with strong dependencies among their variables is refreshing. He presents a substitute mechanism, of course, so the book is not supportive of ID. Still, it helps formalize and highlight the issues involved in a way that makes them easier to talk about. Compositional Evolution is definitely worth a read if you're interested in looking at evolution from an algorithmic perspective – especially if you'd like to critically evaluate the power of evolutionary search strategies.
Posted by Wedge at 8:18 AM

4 Comments:
William Bradford said...
Watson may change the focus by centering it on compositional mechanisms, but such mechanisms do not address objections to standard models. Modular models merely move the problems with gradualism down a logical chain. Gradualism is inherent to any life-origins paradigm, absent the assumption that encoded nucleic acids and functionally sequenced proteins came prepackaged with the beginning of the universe. The difficulty for Watson and others is that they cannot avoid a gradualist approach to a minimal genome and functional cell. They must assume an intricate set of interacting biological modules before any algorithm becomes plausible. When Watson nails down a module applicable to basic protein synthesis and metabolic functions, he will have something worthwhile.
6/26/2006 5:36 PM

Wedge said...
William,
This is absolutely true. I think Watson's results are algorithmically interesting, but they don't explain, as you point out, how the compositional operators that he analyzes might have arisen.


[Bradford]: Therein lies the problem for those wanting to exclude an intelligent inference. It is strongest at the point of life's origin, where abiogenesis is embarrassingly inadequate from a naturalist perspective.
