On the Analogy Between DNA and Language

Can someone explain more precisely the difference between function and meaning? It is not clear to me. Why does a code not contain information? It has meaning for us. We can read it and draw conclusions from it.

Today I studied Campylobacter hepaticus: gene loss. Earlier I studied Mycoplasma: gene loss. Evolution of Bordetella bronchiseptica to B. pertussis and B. parapertussis: gene loss.

Yeah, I think you’re right about that. That’s the imperfect bit about the analogy between evolution and iterative innovation: innovation is intelligently designed.

T-urf13: gene gain.

Remember, your claim was that there was a physical principle that prevented the acquisition of new information at all. Have you dropped that claim?

I think it’s kind of a wash when it comes to one-off analysis code (which is the bulk of what I write, although I’ve also worked on large systems). Yes, they’re usually small programs that don’t keep growing for many years. On the other hand, they frequently don’t have clearly defined requirements because when you’re doing research, you often don’t know what you’re looking for or unexpected things you’ll have to account for. And there’s much less impetus toward cleaning up your code because you know it’s not a long-term product.

We have just been through a devastating two years in which the acquisition of biological information made headlines everywhere from the popular press to scientific journals. This debate is done; it is over. Information arises spontaneously in biological systems.

Well, I think that it is simply an illusion, and with time this illusion will disappear. There is a strong relationship between information and entropy. On the microscale there can, by accident, be phenomena that seem to show a decrease in entropy, while on the meso/macroscale there is an increase. The same is the case with information. The overall pattern is a decrease, while on the microscale there can be small temporary changes that could be interpreted as an increase. Noise that gives the illusion of signal.

Life itself is a local decrease in entropy. I don’t see why this couldn’t be extended to individual parts within each cell.

Just because there is a desire to employ information and entropy as some sort of anti-evolution rhetoric does not make the well-documented reality of the genetic adaptation of SARS-CoV-2 an illusion. That the virus underwent dozens of identified mutations which functioned to increase transmission and evade immune response was documented in real time. This also happens with influenza and other viral diseases. A similar process is in play with resistant bacteria.

It is not just that it is possible for organisms to gain information about their environment and for adaptations to be selected, but that such information exchange is unceasing. Information gain is relentless in nature. The hand-wringing of “oh, biological information can only be lost” is not just wrong; it is not even close, or pointed in the right direction. The repository of life's information is never static, but always expanding.

Of course we have all seen the evolution of SARS-CoV-2. Where do you see an increase in information?

In his book The Logic of Chance, Eugene Koonin writes on page 141: "it is critical to realize that a sufficient level of HGT (horizontal gene transfer) is essential for the long-term survival of any asexual prokaryotic population; otherwise, such a population is extinguished by mutational meltdown. Thus, a sufficient rate of HGT is a condition sine qua non for the continuous survival and evolution of the prokaryotic world." And Koonin is an outstanding scientist and a hard-core atheist.

The original species jump adapting to the novel host ACE2 receptor. Each variant thereafter. You have the prior organism, and now the adapted organisms as well. That is an increase in biological information: function, bit count, however you want to define it.

I’m afraid Koonin is off the mark here – you’ll note that he provides no citation for this particular statement. Plenty of work (mostly theoretical but some experimental) has been done on mutational meltdown (see Muller’s Ratchet). The possibility of mutational meltdown does indeed limit the size of the genome of various organisms, depending on the rate at which they accumulate mutations, but bacterial genomes are typically well under that limit. You will note that RNA viruses (e.g. SARS-CoV-2), with much higher mutation rates, skate much closer to the edge of mutational meltdown, often with little or no exchange of genes, and yet they have no problem retaining highly fit genomes indefinitely.

Which is not to say that horizontal gene exchange is not very useful for bacteria and critical for the survival of small populations. Which is probably why it’s ubiquitous among bacteria.
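In case it helps to see what mutational meltdown looks like, here is a toy simulation of Muller's Ratchet (a sketch of my own, with made-up parameters; a serious model would draw mutation counts from a Poisson distribution and handle back mutation explicitly):

```python
# Toy simulation of Muller's Ratchet. Each individual in an asexual
# population is represented only by its count of deleterious mutations.
# Without recombination, once the least-loaded class drifts to
# extinction it cannot be restored, so the minimum load "clicks" upward.
# All parameter values below are illustrative assumptions.

import random

POP_SIZE = 200       # number of individuals
MUTATION_RATE = 0.3  # chance of one new deleterious mutation per birth
SELECTION = 0.02     # multiplicative fitness cost per mutation
GENERATIONS = 500

population = [0] * POP_SIZE  # everyone starts with zero mutations

for gen in range(1, GENERATIONS + 1):
    # Selection: sample parents in proportion to fitness (1 - s)**m.
    weights = [(1 - SELECTION) ** m for m in population]
    parents = random.choices(population, weights=weights, k=POP_SIZE)
    # Mutation: each offspring may gain one new deleterious mutation.
    population = [m + (random.random() < MUTATION_RATE) for m in parents]
    if gen % 100 == 0:
        print(f"gen {gen:3d}: min load = {min(population)}, "
              f"mean load = {sum(population) / POP_SIZE:.2f}")
```

With these numbers the minimum mutation load clicks upward over the run; increasing POP_SIZE, lowering MUTATION_RATE, or adding recombination slows or stops the ratchet, which is the trade-off the meltdown literature quantifies.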

Yeah, well, I’m a pretty good scientist myself even if I’m not an atheist. What matters is not the reputation of the scientist but how well supported their claims are.

How would you determine if any of the mutations were an increase in information?

What about the differences between the chimp and human genomes? Are any of those differences an increase in information in either lineage? How do you make these determinations?

Well, I would say, with regard to DNA: the level of information is the level of complexity that is needed to successfully complete the circle of life. And the level of complexity is defined as the probability of that needed sequence relative to the complete sequence space of a genome of the same size. I made this definition just now. Suggestions for improvement are welcome.

I’m not seeing anything that can be quantitative. How do we measure complexity? How do we measure success? How do we quantify the complete sequence space? It reads like a bunch of buzz words that don’t mean anything.

In biology, success means that you can reproduce with an index higher than or equal to 1. And I gave a definition of the level of complexity that can be used to calculate the level of information. It's simple: every single obligately specific nucleotide that is necessary for function increases the information by a factor of 4. You can use that to calculate the information content of the primers that you use for PCR.
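If I've understood the proposal, here is a minimal sketch of it (my own illustration; the primer sequence is made up, and I'm reading "a factor of 4" per obligate nucleotide as 2 bits per position):

```python
# Sketch of the information measure proposed above: each obligately
# specific nucleotide contributes a factor of 4, i.e. 2 bits, so a
# fully constrained sequence of length L occupies 1 / 4**L of the
# sequence space of that length. The primer below is a made-up example.

import math

def sequence_space_probability(length: int) -> float:
    """Chance of drawing one exact sequence of this length at random."""
    return 1.0 / 4 ** length

def information_bits(length: int) -> float:
    """Information in bits: log2(4**length), i.e. 2 bits per fixed base."""
    return length * math.log2(4)

primer = "ATGCGTACGTTAGCCTGAAC"  # hypothetical 20-mer PCR primer
print(f"P(random match) = {sequence_space_probability(len(primer)):.2e}")
print(f"information     = {information_bits(len(primer)):.0f} bits")
```

By this measure a fully constrained 20-mer primer carries 40 bits.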

Then by that calculation there was an increase in information in SARS-CoV-2 because new mutations became necessary for replication due to the need for immune evasion. Mutations that confer antibiotic resistance are an increase in information. There are many, many examples we can find where there was an increase in information by that measure.

Fascinating discussion. Sorry I’m late to the party. Just commenting on some odds and ends.

I know next to nothing about computer code and only a little more than that about DNA, but I do know a bit about language. I couldn’t say which one is a better analogy for DNA, but I’d definitely disagree that DNA has nothing at all in common with language. All metaphors eventually run up against their limits, but they’re simply tools to help us wrap our minds around complex concepts. (Translation: neither code nor language is a perfect analogy for DNA.)

On the whole I agree with your OP. Lots of stuff to chew on. The last bit is certainly correct: word meaning and grammar (usage) are determined by “general agreement” (Wittgenstein’s term) among a population of speakers. For example, I could make up a new word every day, but if no one else picked it up and used it similarly, my invention would be meaningless. The same applies to genetic mutations: unless they become fixed in a population, individual genetic variations don’t really matter.

The same applies to language. It also evolves, although via cultural rather than biological evolution. (Defunct pathways and patches in particular make me think of the Bible and the difficulty of interpreting ancient documents.)

Yes, of course DNA doesn’t express abstraction. That’s one place where the analogy breaks down.

On the rest, primatologist and developmental psychologist Michael Tomasello credits the inborn desire to share our thoughts with another for supplying the evolutionary motivation to speak. But a lot of brain, language and cultural development had to occur before that instinct was able to be fully realized. Along the way, human communication had to progress from gestures to single words to two- and three-word combinations lacking grammar. Even though the particular combinations of gesture-sound may have been arbitrarily agreed upon as representing something, that’s not yet “modern” language, let alone abstraction.

“Embodied cognition” led to “cognitive linguistics,” which @Christy can explain much better than I can, but the upshot is that the earliest words and gestures (both in evolution and childhood language acquisition) are related to concrete things and actions. The challenge to language evolution is explaining how terms for abstract (disembodied) concepts arose.

https://psycnet.apa.org/record/2017-01856-001

Language is similar. Things change – the environment, technologies, competition for resources. Language changes in response.