Biological Information and Intelligent Design: Meyer, Yarus, and the Direct Templating Hypothesis

Forgot to mention - thanks for those references for Behe on front loading. Much appreciated. Other than Denton, has anyone in the ID camp developed front loading ideas, that you know of? I know the anonymous Mike Gene did, many years ago, but I’m not aware of others since then (other than Denton).

Discussion of front-loading is useful as a way of showing that “design” does not entail “miraculous disruptions”. From cosmic fine-tuning to the more detailed explorations in the tradition Denton follows (going back at least to Lawrence Joseph Henderson early last century), it is valuable for seeing purpose and teleology in the specifics of lawlike processes (as opposed to the more philosophical point that the existence of lawlike processes at all is evidence of final causation).

But we need to remember that modern science was founded on the twin pillars of God’s reliability in making regular laws, and his freedom in creating contingency, which required empirical observation, not just the elicitation of regular principles.

As I’m sure you recognise, to put all one’s eggs in the basket of front-loading is either to return to pure deterministic Deism, or to exempt all contingency from the creative order of God. That’s so even if one postulates more esoteric “laws” of emergence: it’s to restrict the living God of Christian theism to the distant clockmaker of the Deists, or of the Gnostics - for whom an incompetent demiurge is actually responsible for the down-and-dirty reality.

Therefore, at some point, whilst retaining Christian theism, one can’t avoid considering contingent divine action in addition to any front-loading - not least because it is very difficult to conceive, in the current state of fundamental science, how the laws we know could be anything more than permissive of life, rather than prescriptive of the whole created order. Even Denton does not argue that the human race is purely the outcome of front-loading (in fact he specifically limits the powers of the laws of form presented in his recent book).

It’s in contingency that most controversy lies, and therefore the relationship of creation to contingency deserves the most study by ECs. I suggest that the most fruitful line of approach is the study of chance. Scientifically, this is methodologically limited to the assumption of “randomness”. Theologically, however, “randomness” is a fundamental category error: chance should instead be viewed in terms of “providence”, of which it is one aspect in God’s overall governance of the natural order, and that in which his free creative choice is most to be seen, if not always understood.



Thanks Dennis for your graciousness, and I apologise for contributing to the derailment of the discussion away from your post. I want to compliment you on a fine job in providing the arguments refuting Nelson and Meyer, and I don’t have much to add to that (especially since you quoted my earlier comment in that vein). I am looking forward to your next post on further support for direct templating from other labs, etc.

However, I do have a question regarding the further evolution of the code which I am hoping you will address either here or in a subsequent post in the series. Let’s assume an early version of the code (if it even was a code then) using a few amino acids and their cognate (templated) codons. Since evolution as we know it requires the code to allow for efficient, error-free translation of genotypic modifications into new, fitter phenotypes, how can such a system evolve in the absence of that mechanism? Does this require an entirely new kind of evolutionary mechanism? (The same question is pertinent to the transition from the RNA world to the DNA world.)

I have a couple of questions of perhaps broader scope than what the article entails.

It seems that, at least thus far, only a subset of amino-acid bindings has been shown to have an ancestral chemical origin. Must something similar be found for the remaining sequence/amino-acid pairings for the Direct Templating Hypothesis to be fully vetted, or could the remaining subset have been assigned randomly later, based on the need for those amino acids in proteins?

What mechanism could then drive the transition from having specific chemical bindings to not having them in the current set? I believe you showed last time that this function is now performed by cellular “machines,” but what drove that transition, and how did those machines come to “remember” the specificity originally obtained from chemical binding?

To put it plainly, if the genetic code had an origin driven in part by chemical binding events, then it is not a “genuine code” in the sense we humans use the word—and further research might reveal plausible scenarios by which the entire code may have come to be.


This whole argument exposes the weakness of this debate. The issue is whether life and the universe are designed or rationally structured. It really makes no difference whether they are rationally designed by nature or by God, except that, for reasons of their own, non-believers and some believers do not think that nature can rationally design evolution.

Therefore many believers and most non-believers maintain that if evolution is “natural,” then God did not create it, because it is not rational. However, the DNA code is by nature rational, because it is a language with its own words and syntax.

It is erroneously stated that a code is a way to disguise language. That is not true. The most famous “code” is Morse code, which changes letters into a dot-and-dash format so language can be transmitted electronically by telegraph. We also have encoding and decoding by computers, which does much the same thing.
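The point that encoding transforms rather than hides meaning can be sketched with a toy Morse mapping (the four-letter table below is a hypothetical subset, chosen only for illustration):

```python
# A code is a reversible symbol-for-symbol mapping, not a disguise.
# Tiny illustrative subset of the Morse alphabet.
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}

def encode(text):
    """Translate uppercase text into Morse, one space between letters."""
    return " ".join(MORSE[ch] for ch in text)

print(encode("SOS"))  # ... --- ...
```

Anyone holding the same table can invert the mapping and recover the original text, which is what distinguishes a code from a cipher meant to conceal.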

According to my understanding, all languages are codes. Pater is a code word for Father, as is Abba. They are both governed by the dictionary of their respective languages. The genetic code is a language, created by God in nature. We can see that it works, and maybe how it works, but we really do not understand how God started ex nihilo and brought it all the way up to rational human beings.

This is what we observe and know. It is absurd for non-believers to cause confusion and conflict by claiming that this is not true. Nature is designed. Nature is rational. Nature is good, because God created Nature.

Hi Kyle,

We’ll talk a bit about this in upcoming posts, but the short answer is there are hypotheses, but as of yet little evidence in this area. Abiogenesis and the origins of the genetic code are issues we don’t know much about. What we do know is strongly suggestive of a natural origin, but we are far from working it all out (if we ever will). Research in this area is progressing, though.

Thank you for a series of informative and clear posts on this question. On reading Meyer’s “Signature in the Cell” one objection occurred to me that I’ve not seen discussed. Meyer assumes that biological “information” is conserved. I don’t know of any theorem or axiom to that effect. Indeed, if one draws the analogy (inverse) of information to entropy, one can state that entropy is conserved only in an isolated system at equilibrium (undergoing reversible processes). Otherwise the Second Law disposes of conservation of information. I’d welcome comments / criticisms of this argument.

I think you are right, although I don’t know of any formal theorem. But considering that death leads to a loss of biological information (not a conversion in any sense), it seems clear that conservation is not at all guaranteed. Extinction of a species is an irreversible loss of information, and conservation is clearly disproven by the theoretical possibility of a life-ending (sterilization) catastrophe on Earth.
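The non-conservation point can be illustrated numerically. In the sketch below (sequence length, number of substitutions, and seed are arbitrary assumptions), the Shannon entropy of a sequence’s base composition simply changes under random substitution; nothing in the process enforces its conservation:

```python
import random
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Shannon entropy (bits per symbol) of the base composition of seq."""
    counts = Counter(seq)
    n = len(seq)
    return -sum((c / n) * log2(c / n) for c in counts.values())

random.seed(0)
genome = list("A" * 1000)              # maximally biased composition: 0 bits
print(shannon_entropy(genome) == 0)    # True

# 200 random point substitutions push the composition toward uniformity.
for pos in random.sample(range(1000), 200):
    genome[pos] = random.choice("ACGT")
print(shannon_entropy(genome) > 0)     # True: the quantity was not conserved
```

This is only a toy measure of information, of course, but it shows there is no analogue of a conservation law operating here.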

This is what I wrote about Behe’s position, which he states here in this video.

On May 9th (before Hunter’s critique was published), one of the participants in this dialogue asked Michael Behe to weigh in on common descent. Dr. Behe gave a several minute explanation of his position, in his response to Bill Cole, who is commenting frequently in this conversation. Behe’s explanation of his position is really worth listening to if you are a creationist that cares about this debate.

In particular, Dr. Behe points out that the argument he makes for design is entirely separate from common descent. He goes so far as to explain that the design for molecular machines could be injected into the universe by carefully chosen initial conditions for the Big Bang (i.e., fine-tuning). In this proposal, entirely natural mechanisms (like neutral theory) would correctly (but incompletely) describe our world. He compares evolution to a “trick shot” in billiards, where the initial conditions and skill of the player conspire to make the improbable certain, all while using entirely natural mechanisms.


Agreed - but it is here that Behe’s arguments do not pass muster. Behe claims that simultaneous mutations are needed to assemble irreducibly complex systems, and that this requirement places such systems beyond the reach of unguided evolution, since simultaneous mutations are too improbable. Yet we have very good evidence - in some cases from direct observation - that simultaneous mutations are not required to build new protein-protein binding sites. So, regardless of Behe’s private views, his biological argument is in error.
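A back-of-envelope calculation shows why the simultaneous/sequential distinction matters so much. The mutation rate and population size below are illustrative assumptions, not measured values for any particular organism:

```python
# Why sequential mutations vastly outpace simultaneous ones
# (all numbers here are illustrative assumptions).
mu = 1e-9   # assumed per-site mutation rate per replication
N = 1e9     # assumed number of replications per generation

single = N * mu       # expected new mutants at one specific site per generation
double = N * mu ** 2  # expected simultaneous double mutants per generation

# The simultaneous route is roughly a billion times rarer per generation,
# while the sequential route proceeds at ordinary, observed rates.
print(f"singles/gen ~ {single:.1f}, doubles/gen ~ {double:.1e}")
```

If each mutation can be fixed (or drift neutrally) before the next arises, the waiting time scales with 1/(N·mu) per step rather than 1/(N·mu²) for the pair, which is why demanding simultaneity makes the outcome look impossibly improbable.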



No, but if proteins can be added one by one over time, there is no a priori reason to presuppose that such a system could not be assembled gradually. Simply citing an example of a complex system and saying there is no known pathway to it is insufficient to infer design.


So there is some good work showing plausible incremental evolutionary pathways to the flagellum. In particular, almost all (40 out of 42) bacterial flagellar proteins are homologous to non-flagellar proteins. This is exactly what we would predict from evolution.

I should point out that this finding (the homology of flagellar proteins to non-flagellar proteins) was one of the things that directed me out of the ID movement. This was an easy test of the early ID theories, but it wasn’t carried out by ID proponents. As a college student in biology, I was totally befuddled as to why this basic study was not done.

And in written form in a Nature article:


Thanks Josh - I’m busy teaching a lab, so I didn’t have time to pull material like that together.

Eddie, why do you suppose that so many flagellar proteins are homologous to non-flagellar proteins? What ID hypothesis could account for that observation?

The predictive power of the modern evolutionary synthesis is simply unmatched by anything ID has to offer.

Hi Joshua,

I have to be a little jovial after seeing the video - I am not a fan of ID, and yet the first thing I thought of after viewing the video was of a mechanic finding spare parts (or all of the parts) for an engine. I say this in a light-hearted manner - but seriously, what evidence do you, or anyone, produce for such a step-wise assembling of this organism? The message that comes through loud and clear is that so many steps, driven by chance or random events, are so improbable as to be a fairy tale. I prefer to think that the video is a simplification, but I would be fascinated by clear evidence (data) that identifies previous organisms that verify the many steps proposed in this demonstration.


“Let your conversation be always full of grace, seasoned with salt, so that you may know how to answer everyone.” -Colossians 4:6

This is a place for gracious dialogue about science and faith. Please read our FAQ/Guidelines before posting.