Need reviewers for Common Design theory to be submitted to Science journal

It must be difficult for you as well, because you have yet to tie these quantum effects to DNA sequence in any meaningful manner.

Again, this is nonsense. It is no different than saying invisible pink unicorns produce rainbows and then pointing to rainbows as evidence for invisible pink unicorns.

No, they don’t. You need evidence for this consciousness, not just quantum effects.

For the first 9 billion years of the universe there was no DNA on Earth, or an Earth at all.

I’m not seeing any calculations for any probabilities.

I challenge that premise. Where is the evidence?

Where is the evidence for this premise?

No, they aren’t. No matter how many times you repeat this it will continue to be false.

What does the 2nd law of thermodynamics have to do with any of this?


And yet, he persisted. ; )


What do you mean by “meaningful manner”? What is considered meaningful to you in this case?

On second thought, I believe there may be a misunderstanding regarding the objectives and scope of our article.

Our article challenges two key assumptions of the Extended Modern Synthesis theory, primarily focusing on the latter:

  1. Mutations are a chance process.
  2. All living organisms share a common ancestor.

There are two approaches to teleological explanation: external and internal teleology. External teleology, derived from Plato, posits that purpose in evolution is imposed by a conscious mind. Internal teleology, derived from Aristotle, suggests that intention and purpose are inherent within the evolutionary process, without requiring a conscious agent. Our article aims to highlight how current literature supports the external teleology perspective.

For example, when scientists describe mutations as “random,” they mean that mutations occur without conscious intent—they do not “aim” to fulfill an organism’s needs in a particular environment. Environmental factors can influence the rate but not the direction of mutations. This randomness suggests there is no personal agent selecting adaptive traits during evolution. In addition to questioning this assumption, our article also critiques several aspects of the common descent theory, including the endosymbiosis theory, the artifact hypothesis, and human evolution. Owen’s archetype theory, for instance, fails to explain why nested patterns were chosen for the design process or how these patterns lead to nested hierarchies among vertebrate species and others.

The most controversial aspect of our argument is the separate creation of vertebrate species. However, much of our discussion focuses on whether a conscious agent plays a role in the evolutionary process, which seems to diverge from the objections you raised. Your critiques appear to be more aligned with the assumption of common descent rather than the role of a conscious agent.

This leads me to the next thing you said…

It seems like you are presupposing a different definition of consciousness than our own when you say this. If not, please define the concept of consciousness you are presupposing here.

What is your definition of DNA? Is it a genetic code or just pure chemistry according to you?

For example, researchers can redesign or refactor the genome of bacteriophage T7 to create an engineered surrogate optimized for human purposes, such as resistance to virus infection—a process observed in nature [97]. This showcases how genomes encoding natural biological systems can be systematically redesigned and rebuilt to serve scientific understanding or human intentions [97]. In one instance, scientists synthesized RNA molecules of a virus and reconstructed a poliovirus particle from scratch, without a natural template [97]. This was achieved by utilizing components from another virus, such as specialized proteins (enzymes), to construct an RNA virus capable of addressing the problem of unstable RNA [97]. Upon introducing this synthetic RNA virus into cells, it successfully generated infectious poliovirus particles [97]. The instability of RNA is a well-known challenge in the RNA world hypothesis, and similar solutions have been proposed, such as the Protein-first hypothesis [7].

Furthermore, observations suggest that RNA viruses not only likely preceded the first cells [25, 50] but also played a crucial role in shaping and building the genomes of all species [25,50]. HGT can confer significant advantages to organisms, enabling them to overcome challenges that would otherwise require gradual evolution through mutation and selection [25,50]. This suggests that evolution can be accelerated as a parallel process, wherein innovations originating in different lineages converge in a single cell through HGT [25,50].

In contrast, the concept of a universal common descent among species is not supported due to limitations in natural selection’s ability to explain the transition from non-life to life or to differentiate between non-living and living entities [93]. Particularly, RNA viruses cannot be integrated into the Tree of Life framework because they lack cellular characteristics [50], and no single gene is universally shared among all viruses or viral lineages [50]. Viruses are polyphyletic, having multiple evolutionary origins [50]. Moreover, the presence of horizontally transferred genes in organism genomes can complicate phylogenetic relationships, deviating from the clear vertical inheritance depicted by the Tree of Life [15,25]. This phenomenon blurs the lines of evolutionary descent, as genes from diverse sources may coexist within the genome of a single organism [15,25]. As a result, establishing a singular common ancestry for an organism based solely on its genes becomes challenging [15,25].

Therefore, reconciling the likely natural origin and evolution of viruses with Darwin’s theory of evolution poses challenges, as viruses cannot survive or evolve independently from their hosts through natural selection, nor can they be classified within the Tree of Life [50,93].

Again, I was primarily just restating what the article said. Here is a snippet; you can find it in the fifth paragraph of the article:

All of these examples and applications of modularity in biology are inherently tied to the concept of hierarchy, as modules can often be further broken down into a series of nested sub-modules.
Modularity and hierarchy in biological systems: Using gene regulatory networks to understand evolutionary change - ScienceDirect

BTW, modularity is a fundamental principle of software engineering. Now, here are their conclusions:

Conclusions

Complex biological systems are both modular and hierarchical, and GRNs provide a mechanistic framework to understand evolutionary change. Research often examines concepts such as pleiotropy, co-option, modularity, and homology; however, it is impossible to fully understand these ideas without examining the entire hierarchical system. Through our awareness of this hierarchical system, we are better able to understand how it evolved.

W.L. Hatleberg, V.F. Hinman, Modularity and hierarchy in biological systems: Using gene regulatory networks to understand evolutionary change. Curr. Top. Dev. Biol. 141, 39–73 (2021).

When I made the claim, I was basing it on observations and experiments, not statistics.

Cancer would fall into this category or law because it reflects design decay [51] and a trade-off between DNA repair and cell survival [51]. Therefore, under these circumstances, the designer would not be held responsible for a genuine design flaw or a cruel design feature.

You misunderstood what I said. I used ChatGPT to check what the contents of papers were beyond the abstract. It is another way to help compensate for not having access to those papers, which are under a paywall.

We could use the Lederbergs’ replica plating experiment as an example. In this experiment, they started with a single bacterium and built up a population from there. They then tested the progeny against antibiotics and found that about 1 in every 100 million bacteria was resistant. Not only that, but the mutation that gave rise to antibiotic resistance occurred before the bacteria were ever exposed to the antibiotic. This is what we would expect from mutations that are random with respect to fitness.

What would we expect from guided mutations? Why wouldn’t all of the bacteria get this resistance mutation, and get it precisely when exposed to antibiotics?
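The fitness-blind expectation described above can be sketched as a toy simulation (the mutation rate and population size here are invented and scaled down so it runs in a moment; they are not the Lederbergs’ measured numbers):

```python
import random

random.seed(42)  # reproducible toy run

# Illustrative assumptions, NOT the Lederbergs' measured values:
MUTATION_RATE = 1e-5    # chance a given cell carries the resistance mutation
POPULATION = 1_000_000  # cells grown with no antibiotic present

# During growth, each cell independently acquires the mutation (or not),
# blind to any future antibiotic exposure.
resistant_before_exposure = sum(
    random.random() < MUTATION_RATE for _ in range(POPULATION)
)

# Only now is the plate "exposed": the survivors are exactly the rare
# cells that happened to mutate beforehand.
print(resistant_before_exposure)               # a handful out of a million
print(resistant_before_exposure / POPULATION)  # on the order of MUTATION_RATE
```

A guided-mutation model, by contrast, would predict the resistant fraction spiking in response to exposure, which is not what replica plating shows.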

We could also look at the inherent bias in mutations due to basic chemistry. Transitions are much more common than transversions. This is due to two chemical properties of the bases. First, bases that are more chemically similar to one another are more likely to be mistaken for each other by the enzymes in DNA replication. Second, cytosines in CpG dinucleotides are often methylated, and deamination of methylated cytosine converts a C to a T. Again, these are chemical reactions. So why would guided mutations follow these biases if guidance has nothing to do with the chemical structure of the bases and is instead about the effect of the mutation itself?
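To make the bias concrete, here is a minimal sketch of the standard classification (transitions stay within a chemical family, transversions cross between families). Note that only 4 of the 12 possible single-base changes are transitions, yet transitions dominate in real data:

```python
from itertools import permutations

# Purines (two-ring bases) and pyrimidines (one-ring bases).
PURINES = {"A", "G"}
PYRIMIDINES = {"C", "T"}

def mutation_class(ref: str, alt: str) -> str:
    """Classify a point mutation as a transition or a transversion."""
    if ref == alt:
        raise ValueError("not a mutation")
    same_family = {ref, alt} <= PURINES or {ref, alt} <= PYRIMIDINES
    return "transition" if same_family else "transversion"

# Count the 12 possible single-base changes.
counts = {"transition": 0, "transversion": 0}
for ref, alt in permutations("ACGT", 2):
    counts[mutation_class(ref, alt)] += 1

print(counts)  # {'transition': 4, 'transversion': 8}
```

If mutation were uniform over these 12 changes, transversions would outnumber transitions 2:1; the observed excess of transitions is the chemical fingerprint described above.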

These are just a couple of examples. I could also ask why there is a difference in sequence conservation between exons and introns, or why homoplasy in sequence data is not common. If the same consciousness is mutating two different genomes, why not produce the same mutation in many lineages?

No, they don’t. What they are saying is that there is no statistical correlation between the mutations that happen and the mutations the organism needs in a given environment.

So why wouldn’t guided mutations show a correlation between the mutations an organism gets and the mutations an organism needs? Why do we see just one bacterium out of hundreds of millions get a beneficial mutation?

I am presupposing consciousness to be consciousness.

Genetic code is a metaphor unless you also think H2O is the code for water. DNA is just chemistry.

So what? Humans can also make ice. This doesn’t mean that all ice formation in the universe is done through a consciousness.

That doesn’t support your premise.

And the article is wrong.

Nested hierarchies are not. More importantly, we are not talking about nested hierarchies within one genome. We are talking about nested hierarchies between species.

All of which you claim are under conscious guidance. If mutations are guided, then so too are these. Additionally, this consciousness would be able to revert any cancer causing mutations.


Thank you for the clarification. I see two problems with everything you just said. Firstly, it appears that the criteria for evaluating our theory may have shifted…

Initially, the requirement seemed to focus on an analysis of real-world data connecting quantum effects to observed genetic mutations. I’ve provided responses aimed at meeting this initial criterion. You then followed with this…

What I am getting at here is that you seem to be moving the goalposts now. If not, please explain why not.

Secondly, our theory’s support mechanisms include convergent co-option, horizontal gene transfer (HGT), and non-random mutations. Non-random mutations, while important, play a relatively small role compared to the other mechanisms, especially based on what you considered to be “meaningful”.

I really need to know exactly how you are defining consciousness when you mentioned it in your objection because, as I said before, our definition of consciousness integrates quantum effects and positions consciousness as a fundamental force interacting with biological processes. Without grasping this foundational definition, it’s difficult to appreciate how our theory aligns with real-world data.

Awwww, as I thought. Keep in mind that Patel’s research proposes that the genetic code’s structure mirrors quantum algorithms, suggesting that DNA replication and protein synthesis may exploit quantum coherence to enhance efficiency and error tolerance. This hypothesis implies that the genetic code is more than a mere chemical system; it may embody sophisticated informational processes akin to quantum computation.

Recent experiments have demonstrated that certain quantum search algorithms, like Grover’s algorithm, can naturally occur in physical systems. For instance, studies have shown that electrons moving across specific crystal surfaces can perform quantum searches, indicating that such algorithms might indeed be inherent properties of nature.

MIT Technology Review

These findings lend credence to Patel’s hypothesis by illustrating that quantum search processes can manifest in natural systems. If such quantum effects are present in biological processes, it supports the idea that the genetic code could utilize quantum mechanisms to optimize its functions. Therefore, rather than undermining Patel’s research, these experimental results provide empirical support for the potential role of quantum algorithms in biological systems, suggesting that the genetic code may indeed be more than just chemistry.

The comparison with ice formation doesn’t align with our model, as our theory considers quantum mechanisms that interact specifically with biological processes in living organisms, not inert matter.

I agree, but this does:

Reconciling the likely natural origin and evolution of viruses with Darwin’s theory of evolution poses challenges, as viruses cannot survive or evolve independently from their hosts through natural selection, nor can they be classified within the Tree of Life [50,93].

Then, here is another source that shows that it does. Just read the conclusion of this paper:

D. Batory, S. O’Malley, The design and implementation of hierarchical software systems with reusable components. ACM Trans. Softw. Eng. Methodol. 1, 355–398 (1992).

tosem-92.pdf

Correct, but what is your point here?

I don’t think I understood this properly. So I decided to backtrack. Can you explain more on what you mean by this?

For instance, are you saying you reject this claim as well:

All of these examples and applications of modularity in biology are inherently tied to the concept of hierarchy, as modules can often be further broken down into a series of nested sub-modules…

Conclusions

Complex biological systems are both modular and hierarchical, and GRNs provide a mechanistic framework to understand evolutionary change. Research often examines concepts such as pleiotropy, co-option, modularity, and homology; however, it is impossible to fully understand these ideas without examining the entire hierarchical system. Through our awareness of this hierarchical system, we are better able to understand how it evolved.

W.L. Hatleberg, V.F. Hinman, Modularity and hierarchy in biological systems: Using gene regulatory networks to understand evolutionary change. Curr. Top. Dev. Biol. 141, 39–73 (2021).

Here is another source showing it does. Just read the conclusion of this paper:

D. Batory, S. O’Malley, The design and implementation of hierarchical software systems with reusable components. ACM Trans. Softw. Eng. Methodol. 1, 355–398 (1992).

tosem-92.pdf

I’m no biologist but it doesn’t take one to see the trouble here – a lack of comprehension. You think that this:

Is moving the goalposts from this:

When this latter paragraph gives you a firm definition of the phrase you’re wondering about.

T_Aquaticus isn’t moving the goalposts, he’s describing them from different directions to try to get you to engage in some actual science. You know, like things should be:

I don’t have the citation for the paper, but I read one fairly recently that analyzed the progression of evolution and concluded that the emergence of upright, bipedal, tool-using intelligence was inevitable, no guidance needed. That points to where your efforts need to be if you want to be doing science: show where statistically the known mechanisms of biology are insufficient to account for all observed phenomena. Until you can do that, your effort is no more relevant to science than the ancient categories of beauty, goodness, and truth.

I would love to see it discovered that there is consciousness in all life and even in all matter and energy, but your paper is just wishing, not science. It doesn’t even rise to the level of a call for research since it provides no suggestions for testable hypotheses.

And where are your hypotheses for testing this idea? How will you show that “quantum algorithms” provide a better explanation than the current paradigm?

So show statistically and with solid data that the idea that there is consciousness behind viruses fits better than the current paradigm.

  1. There’s nothing in there about nesting.
  2. The hierarchies being discussed are not relations between programs but within programs, as evidenced by the use of the term “layering” to describe them.

So that article doesn’t actually provide support for your proposal.

Nor does the next one. You seem to be assuming that because “hierarchy” and “nested” are used in the same paper that it is saying that programming code corresponds to the evolutionary nested hierarchy, but that is not the case: nothing in either article suggests that the design of different programs can be organized as a nested hierarchy; they are instead speaking of structure within programs.


Bovine faeces. That article doesn’t even mention software engineering.

I did not misunderstand what you said. Your current use of ChatGPT is irrelevant to your history of lying about your sources before you started using ChatGPT, and doubly irrelevant to the cases where you were lying about the contents of sources that are fully available.

It hasn’t shifted.

And I have consistently said that you need to tie this to actual DNA data in some meaningful manner. Simply pointing to quantum effects means nothing. If a consciousness is guiding mutations then you need to show this by the consequences of this guidance which means looking at how genomes change, when they change, how it relates to the environmental pressures, how it relates to the chemistry of DNA, and how it relates to selectable function.

  1. Why do transitions outnumber transversions?
  2. Why don’t all bacteria immediately get a mutation for antibiotic resistance when exposed to antibiotics?
  3. Why is there such a huge difference in the amount of sequence conservation in exons and introns?

They’ve been in the same place the whole time. You have just refused to kick a ball towards the goalposts.

Theories don’t support mechanisms. Evidence does. You need to produce evidence that a consciousness is guiding these mechanisms as you are trying to claim.

If you were conscious of the fact that a mutation would cause cancer would you allow it to happen? Would you refuse to fix that mutation if it did occur?

All of which is chemistry (or physics, if you will).

No more so than any other quantum interaction in the universe.

Quantum mechanisms also exist in inert matter.

“The electronic properties and optical response of ice and water are intricately shaped by their molecular structure, including the quantum mechanical nature of the hydrogen atoms.”
https://pubs.acs.org/doi/10.1021/acs.jpclett.4c01315

That doesn’t support your premise. That’s an argument from ignorance.

That directly contradicts your claims because the modules of that software system can be swapped in and out in violation of a nested hierarchy. You keep ignoring the nested part and what is being compared.

You are claiming that a consciousness is purposefully causing cancer through mutations, correct?


@RTBsupporter,

Thinking further about this, the 2005 chimp genome paper might be a good place for you to start. Read through the analysis of the comparison of the chimp and human genomes, and think about how your model addresses those observations.

https://www.nature.com/articles/nature04072

For example:

Just for reference:
Ka = non-synonymous substitution rate
Ks = synonymous substitution rate
Ki = intron substitution rate

Why would indels tend to be just one codon and occur in repetitive sequence?

Why is the ratio of non-synonymous to synonymous mutation rates the same as the ratio of non-synonymous to intron mutation rates?

How does your model explain why the non-synonymous mutation rate is 37% higher at the ends of chromosomes?
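The point of comparing those two ratios can be illustrated with toy numbers (the rates below are invented for illustration, not taken from the chimp genome paper): synonymous sites and introns are two independent, roughly neutral baselines, so selection against protein-changing mutations should show up the same way against either one.

```python
# Invented, illustrative substitution rates (per site) -- NOT real data:
ka = 0.0008  # non-synonymous rate (changes the protein)
ks = 0.0040  # synonymous rate (neutral baseline #1)
ki = 0.0040  # intron rate (neutral baseline #2)

# Under purifying selection, Ka is depressed relative to BOTH baselines,
# and the two ratios agree because Ks and Ki measure the same neutral rate.
print(round(ka / ks, 3))  # 0.2
print(round(ka / ki, 3))  # 0.2
```

A guided-mutation model would need to explain why its guidance happens to reproduce exactly the pattern expected from neutral baselines plus selection.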


Before I address this, I just want to say thank you for crafting a response I can actually address and use to improve our article or establish a dialogue with someone else on this forum where we can understand each other’s viewpoint better.

Yes, what you suggested is what I have been showing (or at least trying to show) over the course of our discussion. Let me give you an overview of what has been presented on here so far…

  1. Introduction to Mechanisms in Our Theory
    Our theory posits that convergent co-option, horizontal gene transfer (HGT), and non-random mutations drive evolutionary innovation. While non-random mutations play a role, the primary mechanisms in our model are convergent co-option and HGT. Evidence from the ENCODE project supports the functionality of these mechanisms by revealing activity within noncoding regions of the genome, traditionally seen as “junk DNA” by Darwinian evolutionists.
  2. ENCODE Project Findings and Functional DNA
    The ENCODE project provides a model that aligns with our theory, particularly with its discovery of functional elements within “junk DNA.” The project identified biochemical activity in 80% of the human genome, exceeding the 51% threshold proposed by Owen’s theory, which suggests a significant proportion of noncoding DNA has functional utility. While we didn’t predict this exact percentage, ENCODE’s findings reinforce the validity of our model, making our theory’s predictions appear even more robust.
  • Mechanisms of Biochemical Activity: ENCODE assessed DNA activity by examining transcription, transcription factor binding, histone binding, DNA methylation, and three-dimensional enhancer-gene interactions. These processes are integral to gene regulation, providing evidence that these sequences have functional roles beyond mere genetic “noise.”
  3. Noncoding RNA and Pseudogene Functionality
    As supported by the competitive endogenous RNA (ceRNA) hypothesis, noncoding RNAs—including pseudogenes—show biological functionality, which aligns with ENCODE’s findings. This model suggests that noncoding RNAs regulate protein synthesis and gene expression, indicating these sequences serve essential functions. Mattick and Dinger have further emphasized the functional role of noncoding RNAs, citing hundreds of cases that show biological relevance in development and disease, which likely extends to other noncoding RNAs across the genome.
  4. Further Evidence Supporting Functionality of “Junk DNA”
    Professor Alistair Forrest and his team at the Harry Perkins Institute of Medical Research have identified numerous long noncoding RNAs (lncRNAs) with potential involvement in disease and genetic traits, suggesting that much of the noncoding genome is functional. Their findings challenge the hypothesis that these lncRNAs are simply byproducts of a noisy transcriptional process.
  5. Critique of Random Noise Hypothesis
    The alternative hypothesis of random noise does not adequately account for the data. Studies show that nonfunctional protein-DNA interactions can interfere with essential processes like transcription, replication, and mutational repair. Qian and Kussell (2016) demonstrated that genomes have evolved mechanisms to reduce weak binding motifs, indicating selective pressure against nonfunctional interactions. This undermines the idea that random noise alone can explain the complexity and organization of the genome.
  6. Protein-Protein Interactions (PPIs) and Cellular Complexity
    Additional research highlights the necessity of precisely regulated protein-protein interactions (PPIs) within cells. PPIs involve carefully structured protein surfaces that facilitate essential interactions while minimizing unwanted ones. Research from Harvard supports this view, revealing that the concentration of PPI-participating proteins is meticulously regulated to maintain cellular function. This level of precision in biochemical interactions suggests an underlying design rather than random, unguided processes.
  7. Supporting Fine-Tuned Functional Networks
    Given the extensive data showing transcription factors bind specific DNA sequences to regulate gene expression, ENCODE’s conclusions appear to offer a probable explanation for genome functionality. While alternative explanations exist, further research can help clarify uncertainties about function in genome binding, which increasingly aligns with the perspective that these networks are fine-tuned, potentially indicative of an intelligent design.

Summary of Our Position
The body of evidence, from ENCODE’s findings to functional studies on noncoding DNA and PPIs, strongly supports our theory’s mechanisms—convergent co-option, HGT, and non-random mutations. This extensive data challenges the sufficiency of random, unguided processes alone, suggesting instead a finely-tuned system. Our position is not to dismiss established evolutionary mechanisms but to highlight areas where these mechanisms may be insufficient and to present additional layers of complexity that point to an underlying design.

I am not sure what you mean here. It has already been tested and confirmed by this experiment:

M. Roget, S. Guillet, P. Arrighi, G. Di Molfetta, Grover Search as a Naturally Occurring Phenomenon. Phys. Rev. Lett. 124, 180501 (2020).

Are you asking me for another way to test and confirm Patel’s hypothesis, which we integrate in our theory?

Based on a synthesis of Owen’s archetype theory and the ENCODE project’s findings, we predict that a substantial proportion of ERVs, potentially exceeding 50%, will eventually be shown to have context-dependent or regulatory functions, challenging the traditional view of ERVs as non-functional ‘junk’ DNA.

Although this is not formally peer-reviewed yet, it does indeed support what I suggested before, and even more, it seems:

Software in the natural world: A computational approach to hierarchical emergence

The reason I suggested you might be moving the goalposts is that you initially stated, “At a minimum, there needs to be analysis of real-world data, which is completely missing from the paper for the claims you are trying to make…” This implies you expected me to provide this analysis, regardless of which definition of function we use to evaluate ENCODE’s results.

However, midway through our discussion, you introduced a new requirement: that only your preferred definition of function is valid if I aim to present data suggesting consciousness guides mutations in a fitness-relevant way. I’ve already provided data within the framework of the causal role definition of function, yet it appears this was dismissed based on the definition shift.

And what is your point here?

I understand. I was referring to the common design theory, which is distinct from the universal common designer theory. It’s similar to Darwin’s framework, where natural selection and common descent are two separate theories. Therefore, your objection doesn’t refute my earlier point, as it’s entirely possible that this designer created organisms exclusively through evolutionary processes.

Let me change the premise then:

Premise 2 posits that unguided processes, based on available observations and experiments, cannot adequately explain the complexity or origin of viruses.

Not quite. The designer cannot create a universe without decaying effects because the second law of thermodynamics is a fundamental principle that applies to all physical systems, including those governed by quantum mechanics, suggesting that this law likely exists in all possible worlds.

Therefore, we propose that the designer edits and limits harmful genetic changes, regulating detrimental mutations in a way that maintains a balance between predator and prey populations. This balance is essential, as an overabundance of either predators or prey could lead to ecosystem collapse.

I don’t recall ever doing this on the BioLogos Forum.

Noncoding regions of the genome were not traditionally seen as junk DNA by biologists. That some noncoding DNA has function has been known since at least the 1950s.


How was this determined? Endogenous retroviruses only make up ~4% of the human genome, and HGT is nearly absent in mammal evolution.

HOW???

You are assuming all of these sites are acting in gene regulation. You have not backed this assumption with any evidence.

No, they don’t. Being transcribed is not function.

How did they determine function? How many functional lncRNAs did they find? What percentage of the genome is made up of functional lncRNAs?

But not eliminate.

The immune system shows otherwise. The antigen-binding regions of antibodies are assembled from random sequence, and they regularly bind to antigens.

You mean you don’t have that evidence now? Then how can you say they have function?

No, it doesn’t. Find anywhere in that article where it says individual software programs fall into a nested hierarchy.

From the very beginning I have explained why the causal definition of function is nonsense. Here is Graur saying the same thing:

Does your model accept universal common ancestry?

That is an argument from incredulity, and you are inserting your own model into a gap in our knowledge which is an argument from ignorance. You need positive evidence that your mechanisms created viruses, not simply arguments against other explanations.

What does the second law of thermodynamics have to do with the effect of mutations???

Are you saying that thermodynamics prevents neutral or beneficial mutations from being created? If so, HOW???

For example:

AAAAAA --mutation--> AAATAA

So are you saying that if this mutation results in a beneficial change then it would be prevented by thermodynamics, but if it results in a deleterious effect it is being caused by thermodynamics? If so, HOW??

It would seem to me the only place where thermodynamics really comes into play is in the energy required to extend the DNA molecule by another nucleotide.

Natural selection already does this without the need for guidance.


It was elsewhere.

It’s also not clear why a conscious, self-collapsing wave function (or whatever it is) that can do anything has to obey the 2nd Law.


@Swilling is a neuroradiologist with connections to Reasons to Believe. I would be curious as to his thoughts on this paper.

You mean the one you originally posted? I can’t wade through all 175 comments.

“No article found.”

As to your above references, as I’ve said I’m no biologist, so I can’t evaluate a lot of what you’re saying; I let the biologists do that – and it seems to me that so far they have concluded that the points you reiterate are not valid.

On what basis? How do observed changes indicate that there is a designer involved? If this isn’t derived from data, what predictions – solid, statistically significant – will you make that would show such involvement?

It’s been pointed out to you repeatedly.

An analogy popped into my head, from a time when a couple of others and I had to move a gravel pile about ten feet by shovel:
Being transcribed is equivalent to gravel being scooped up in a shovel; function is equivalent to the gravel actually arriving at the new location. A fair amount of our shoveling didn’t land the gravel in the new pile, so that portion had no function.

Another: being transcribed is equivalent to a part for a circuit board being taken out of a tray; function is when the circuit board is connected and that part does its job.

Another analogy . . .

Light switch on a wall = transcription factor binding site
Flipping the switch = transcription
Light turning on = transcript with selectable function

You don’t have function without all three. The simple existence of a switch on the wall is not function. If you flip the switch and nothing happens there is no function. It is only when the light bulb turns on that you have function.

Yes - the “Reboot of Richard Owen’s Common Archetype Theory” of the OP.