Need reviewers for Common Design theory to be submitted to Science journal

I understand your perspective; however, you did not clearly define your criteria when I initially asked you to lay them out. You began with, “At a minimum, there needs to be analysis of real-world data, which is completely missing from the paper for the claims you are trying to make.” This phrase implies that an exception might be possible.

Now, it appears you’re acknowledging a shift in your criteria while justifying this change in focus. If this isn’t the case, please clarify how this does not represent a shift. Given your initial statement, “At a minimum,” I find it challenging to see how there hasn’t been a shift in your focus.

Until this point is clarified or conceded, I am going to postpone addressing the rest of your comments.

Ok but what is your point here? Or were you just correcting what I said?

The designer cannot create a universe without decay effects, as the second law of thermodynamics is a fundamental principle that applies to all physical systems, including those governed by quantum mechanics [1]. This implies that the second law likely holds across all possible worlds.

Thus, preventing decay entirely would require suspending the second law (i.e., performing a miracle). According to our theory, however, the laws of the physico-chemical world reflect the designer’s consistent nature, so we would not expect such a suspension. Instead, we would expect the designer to work within natural laws to mitigate harmful genetic changes and regulate mutations, preserving ecosystem balance by managing predator-prey populations, as an imbalance could lead to ecological collapse.

In the future, research will seek to verify that the alleged “harmful” design features outlined by Rubicondior [73] contribute to a measurable increase in the efficiency and speed of survival, reproduction, and adaptation in populations of nephesh basic-type animals or humans. This prediction implies that features traditionally considered harmful or detrimental actually contribute to the efficiency and speed of survival, reproduction, and adaptation in populations. It contrasts with the notion in Darwinian evolution that natural selection operates through a “cruel” and “mindless” process favoring individual fitness rather than the collective benefit of the group [14,85]. Thus, confirming this prediction would challenge conventional evolutionary explanations by suggesting that seemingly harmful features may play beneficial roles in evolutionary processes [86].

Testing Harmful Design Features:

Experimental Setup: Choose populations of organisms with alleged harmful design features, such as pathogens or features reducing survival or reproductive success.

Data Collection Methods: Measure survival rates, reproductive success, and adaptability in populations with and without the alleged harmful design features.

Statistical Tests: Employ statistical tests, such as chi-square tests or logistic regression, to analyze the differences in survival, reproduction, and adaptation between populations with and without the alleged harmful design features.
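As a rough illustration of what such an analysis could look like (the counts and variable names below are hypothetical, not from any actual dataset), a chi-square test on survival counts and a logistic regression on an adaptation outcome might be set up roughly as follows:

```python
# Minimal sketch (hypothetical data) of the proposed comparison: a chi-square
# test on survival counts and a logistic regression on an adaptation outcome.
import numpy as np
from scipy.stats import chi2_contingency
import statsmodels.api as sm

# Hypothetical 2x2 table: rows = with / without the alleged harmful feature,
# columns = survived / did not survive.
survival_table = np.array([[120, 80],
                           [140, 60]])
chi2, p_surv, dof, expected = chi2_contingency(survival_table)
print(f"Survival chi-square: chi2={chi2:.2f}, p={p_surv:.3f}")

# Hypothetical per-individual data: feature present (1/0) as a predictor of
# whether the individual adapted (1/0), analyzed by logistic regression.
rng = np.random.default_rng(0)
feature = rng.binomial(1, 0.5, size=400)
adapted = rng.binomial(1, np.where(feature == 1, 0.55, 0.50))
X = sm.add_constant(feature)
model = sm.Logit(adapted, X).fit(disp=False)
print(model.summary())
```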

Still does not ring a bell

This paper makes no connection to biology. It appears to be a good paper in support of hierarchical design and reusable components in software engineering. I am not disputing these ideas.

I am not qualified to speak to the specific biological mechanisms of virus evolution, but this idea of comparing biological organisms with computer software strikes me as something like the “watchmaker” fallacy. That is, the assumption that because entities share one attribute, they must share all attributes.

If we are talking about a scientific paper here, the watchmaker analogy doesn’t hold water scientifically. In the philosophical realm, I believe there are lots of reasons, qualia among them, to see God’s Hand in nature, but the watchmaker analogy isn’t one of them.

Let me give you a better paper for that then: Software in the natural world: A computational approach to hierarchical emergence | alphaXiv

BTW, this paper has not been formally peer-reviewed yet, probably because it is fairly new and appears to be quite technical and complex.

I agree but that is not what is argued in the paper. Richard Owen’s theory goes beyond the watchmaker argument. Read this for more:
Archetype or Ancestor? Sir Richard Owen and the Case for Design - Reasons to Believe

You seem to be once again making up ideas and attributing them incorrectly to cited studies. The second law is a fundamental principle in our universe, but the article you cite (which looks kind of fringy to me, but no matter) doesn’t say anything about all possible worlds, just the world where we already know the 2nd law holds.

In any case, you’re missing my point. Yes, the 2nd law holds in our universe. Your self-collapsing wave function looks like it could violate the 2nd law. Therefore, your hypothesized wave function is inconsistent with the physical laws of our universe.

(Stepping back a bit… The conscious self-collapsing wave function that’s being hypothesized here is a notably amorphous concept. As far as I can tell from the brief and vague language, the wave function exists abstractly, it is the basic physical entity encapsulating the entire universe, it’s the cause of consciousness, it is itself conscious, or at least embodies a nonlocal consciousness, it designs organisms, it’s deterministic, and it somehow does something with gravity to execute its designs. The reason I say that it seems to violate the 2nd law is that it must be able to do computation in order to design things, but that computation doesn’t occur in the physical universe or carry the inevitable thermodynamic cost that physical computation does. It sure seems to be a kind of weird, pantheistic god. )

3 Likes

Again, biological inheritance has nothing to do with modularity and hierarchy in technology.

You can have technical hierarchies which are neither nested nor tree structures. Purposeful hierarchy in technology often exists to facilitate independence between layers. The Open Systems Interconnection (OSI) reference model for computer packet exchange is an example. The application layer is independent of the physical layer, and so forth through the seven basic layers. You can swap entirely different technologies, say optical fiber for copper wire, for one layer and still conform to the communication protocol.
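A toy sketch of that kind of layer independence (the class names here are illustrative, not any real networking API): the application layer depends only on an abstract interface, so the physical medium can be swapped without changing anything above it.

```python
# Toy sketch of layer independence: the application layer depends only on an
# abstract physical-layer interface, so the medium can be swapped freely.
from abc import ABC, abstractmethod

class PhysicalLayer(ABC):
    @abstractmethod
    def transmit(self, frame: bytes) -> None: ...

class CopperWire(PhysicalLayer):
    def transmit(self, frame: bytes) -> None:
        print(f"electrical pulses: {len(frame)} bytes")

class OpticalFiber(PhysicalLayer):
    def transmit(self, frame: bytes) -> None:
        print(f"light pulses: {len(frame)} bytes")

class Application:
    def __init__(self, medium: PhysicalLayer) -> None:
        self.medium = medium  # depends only on the interface, not the technology

    def send(self, message: str) -> None:
        self.medium.transmit(message.encode())

# Either medium conforms to the same contract; the application code is unchanged.
Application(CopperWire()).send("hello")
Application(OpticalFiber()).send("hello")
```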

The biological nested hierarchy is a tree structure due to passing of traits by descent. This is contrary to independent assignment of traits.

You have had a number of experienced programmers try to impress this upon you - it is not some open question. Your referencing papers about software hierarchy is pointless, but it is probably not so much that the papers are wrong as that you are misunderstanding them.

1 Like

I think we’re going in circles now; the request of the OP has been more than satisfied by the very thorough criticism not just of the paper but of defenses offered for it.

The OP should take the proffered advice and go back to the proverbial drawing board.

6 Likes

He’d have to read the papers in order to misunderstand them.

1 Like

I understand your concern, and I apologize if my reasoning was unclear. My statement—“the second law of thermodynamics is a fundamental principle that applies to all physical systems, including those governed by quantum mechanics”—was intended as a reference-supported claim. I then drew an inference from this principle, speculating that if the second law holds universally, it likely applies to all possible worlds, though this inference was my own, not an assertion by the article.

My point was to underscore that while the second law generally applies, I am exploring whether certain conditions (like those in a self-collapsing wave-function model) might appear to bypass or temporarily mitigate it, without outright violation.

Your concern about potential inconsistency with the second law is appreciated, but I believe there’s a logical gap in the argument as stated. Just because the self-collapsing wave-function might seem to challenge the second law does not necessitate a definite violation or inconsistency. Instead, I propose that this model could hypothetically allow for certain regulatory mechanisms that mitigate the effects of entropy rather than outright suspending the second law.

For instance, rather than violating the second law to prevent decay, we could envision the designer working within the constraints of natural laws, subtly influencing mutations and ecological balance in a way that sustains life without disrupting fundamental thermodynamic principles. This approach would align the wave-function with the second law while allowing for life-preserving regulation.

Yeah, that is pretty much our model in a nutshell.

On that note, let me provide a picture of that model in mathematical terms. FYI, I will be using ChatGPT to convert and organize our model into mathematical terms since this is beyond my expertise:

In this model, the universal self-collapsing wave-function inherently favors life-supporting configurations. In this context, the wave-function is not randomly selecting universes but is instead biased toward configurations where the matter distribution and dark energy balance enable the formation of life-sustaining structures.

Thus, every universe generated by this wave-function:

  1. Has a fixed dark energy value Λ,
  2. Collapses into initial conditions conducive to life-supporting structure formation,
  3. Excludes configurations that would lead to lifeless or structurally unstable universes.

This quantum gravity model would then argue that the wave-function’s intrinsic properties direct it to create universes that are all inherently capable of supporting life. This avoids the need for a diverse multiverse of habitable and uninhabitable universes and instead suggests a deterministic or highly selective mechanism behind the emergence of life-supporting universes. I hope this makes things even clearer.

It seems there might be some misunderstanding, so I’ll clarify the logical structure of the argument to address your specific critiques:

  • Premise 1: Effects observed in past events should be explained by causes known from uniform experience to produce similar effects, per Lyell’s principle of causation, supported by Darwin.
  • Supporting Evidence: Humans can design and create viruses, showing that intelligent agents can produce complex viral structures.
  • Premise 2: Unguided processes, according to observations and experiments, cannot adequately explain the likely origin or complexity of viruses.
  • Conclusion (Inductive): It is likely that viruses were designed by a universal common designer, as this aligns with the principle of causation from known, observable effects.
  • Premise 3: In software engineering, nested hierarchies are used to manage complexity through design goals such as ease of maintenance, scalability, and adaptability.
  • Premise 4: Viruses exhibit nested hierarchies in their structural organization, similar to those in complex, engineered systems.
  • Conclusion (Supporting): If biological systems are designed with similar goals of efficiency and adaptability, nested hierarchies would naturally emerge as a result of intentional design, further supporting the inference of a designer for viruses.

The article I referenced (arXiv:2402.09090) supports Premise 3 by discussing how hierarchical organization emerges in engineered systems for efficiency and complexity management. If you disagree with this premise specifically, I’d appreciate clarification on whether you’re suggesting:

  • The article does not support the use of nested hierarchies in complex, designed systems, or
  • You find the analogy between nested hierarchies in viruses and engineered systems untenable.

Understanding your objection more precisely would help me address it more effectively.

1 Like

It appears to be beyond ChatGPT’s expertise too.

Though it’s possible I’m misinterpreting what some of the symbols in it mean.

What quantities do a, c, k, G, ρ and Λ represent?

1 Like

There’s been no shift. From the very start I have been expecting to see how your mechanisms play themselves out in real genomes in real species. This is necessarily going to require comparative genomics, either within a species or between species. If you are going to say mutations are not random then it is blindingly obvious that you will need to address the very first experiments that provided evidence for random mutations, such as the Lederbergs’ plate replica experiment and Luria and Delbrück’s fluctuation assay.
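(For readers unfamiliar with the logic of the fluctuation assay: if mutations arise spontaneously during growth, before selection is applied, mutant counts across parallel cultures are strongly over-dispersed, with occasional “jackpot” cultures, whereas mutations induced only at the moment of selection would give roughly Poisson-distributed counts. A rough simulation of that contrast, with made-up parameters, is sketched below.)

```python
# Rough simulation (made-up parameters) of the Luria-Delbruck logic:
# spontaneous mutations during growth produce "jackpot" cultures and a large
# variance-to-mean ratio; mutations induced only at selection give ~Poisson counts.
import numpy as np

rng = np.random.default_rng(0)
n_cultures, generations, mu = 500, 20, 1e-7  # cultures, doublings, mutation rate

def spontaneous_counts():
    counts = np.zeros(n_cultures)
    pop = np.ones(n_cultures)
    for g in range(generations):
        new_mutants = rng.poisson(pop * mu)
        # each new mutant founds a clone that doubles for the remaining generations
        counts += new_mutants * 2 ** (generations - g)
        pop *= 2
    return counts

spont = spontaneous_counts()
# Induced model: same expected number of mutants, but Poisson-distributed.
induced = rng.poisson(generations * mu * 2 ** generations, size=n_cultures)

for name, c in (("spontaneous", spont), ("induced", induced)):
    print(f"{name}: mean={c.mean():.2f}, variance/mean={c.var() / c.mean():.2f}")
```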

1 Like

You have yet to support this premise.

There are no nested hierarchies of separate software. Your premise fails.

You also have not supported this premise.

Where does the article discuss a nested hierarchy of separate computer programs? If I compared several video games would you be able to show how a comparison of their programs produces a nested hierarchy?

That’s not a model. It’s a loose collection of inconsistent ideas couched in vaguely scientific language.

What’s your goal here, @RTBsupporter? You’ve been told by pretty much everyone here, including those with more relevant expertise in crucial areas than the authors of this paper, that the paper is fatally flawed. And flawed not in a “Here’s your mistake” kind of way, but in a “Not even wrong” way. When you argue in its defense, you just end up revealing more flaws. What are you now trying to accomplish?

1 Like

I fail to see how you are even applying thermodynamics to DNA replication. Have you calculated the difference in entropy for a faithfully copied piece of DNA versus a mutated copy?

From what I have seen thus far, you have already given the farm away. You admit that unguided mutations occur spontaneously, so now you need to describe how to differentiate between unguided and guided mutations. Next, you claim that an intelligence is guiding inheritance in some way to protect against deleterious mutations, but natural selection already does this. I can only conclude that the standard mechanisms described in the theory of evolution are sufficient for what we observe.

How is a mutation “decay”? Until you tie mutations into thermodynamics in a mathematical sense I don’t see how your claims make any sense. Thermodynamics is not about what we would subjectively determine to be decay. Thermodynamics is about the movement of work and energy in a system.

Also, there are a whole host of cellular processes that reduce entropy, but all that is required is an input of energy to drive that loss of entropy. In fact, DNA repair mechanisms get rid of many mutations, and no one thinks this is a violation of thermodynamics because it is coupled to other chemical reactions that produce free energy (e.g. ATP —> ADP/AMP).
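To put the coupling point in standard free-energy terms (textbook bookkeeping, not anything specific to the paper under discussion): a locally entropy-reducing step can proceed so long as the coupled reaction makes the overall free-energy change negative,

$$\Delta G_{\text{total}} = \Delta G_{\text{repair}} + \Delta G_{\text{ATP hydrolysis}} < 0,$$

so with the standard value $\Delta G^{\circ\prime}_{\text{ATP}} \approx -30.5\ \text{kJ/mol}$, a repair step costing roughly $+20\ \text{kJ/mol}$ still yields $\Delta G_{\text{total}} \approx -10.5\ \text{kJ/mol}$.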

This question pertains more to the theory of common design than to our current focus, the common designer theory. Our primary aim is to examine whether evolution was a guided process. Comparative genomics may be relevant later, but we first need to address the foundation of our model and the hypothesis of a guided evolutionary process.

We are not arguing that evolution itself is non-random but rather that it is guided by a conscious agent, and that this guidance can be detected. This perspective is distinct from the claim that mutations are inherently non-random, and the classic experiments you reference, while foundational, do not address our hypothesis. Additionally, these studies are over 50 years old and do not negate ENCODE’s findings, which support functionality in non-coding regions.

Our premise is supported by the limitations of Darwin’s theory, which explains the processes after life’s origin but does not address the complexity of viral origins. Given substantial evidence for the virus-first hypothesis, Owen’s theory provides a viable framework, much like string theory is valued over loop quantum gravity for its explanatory power despite challenges in empirical testing.

Owen’s theory offers the best available explanation for viral complexity and origins. While direct evidence may be challenging to obtain at the moment, the theoretical basis remains compelling given the gaps in Darwinian explanations for viral origins.

The reference to nested hierarchies in software was intended to illustrate Premise 3: in software engineering, nested hierarchies are used to manage complexity for scalability, maintenance, and adaptability. This does not claim that software programs form biological-like nested hierarchies but rather draws an analogy to design principles that may also apply in biology.
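A toy illustration of what “nested hierarchy” means in a software design context (all names below are made up for illustration): a system contains subsystems, which contain components, and each element belongs to exactly one parent.

```python
# Toy sketch of a nested (containment) hierarchy as used in software design.
# All names are illustrative.
from dataclasses import dataclass, field

@dataclass
class Component:
    name: str

@dataclass
class Subsystem:
    name: str
    components: list[Component] = field(default_factory=list)

@dataclass
class System:
    name: str
    subsystems: list[Subsystem] = field(default_factory=list)

app = System("media_player", subsystems=[
    Subsystem("playback", components=[Component("decoder"), Component("mixer")]),
    Subsystem("library",  components=[Component("indexer"), Component("search")]),
])

# Each element has exactly one parent, which is what makes the hierarchy nested.
for sub in app.subsystems:
    for comp in sub.components:
        print(f"{app.name} > {sub.name} > {comp.name}")
```

Whether that engineering sense of nesting maps onto the nested hierarchy produced by descent in biology is, of course, exactly the point in dispute here.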

Recent research suggests otherwise. For instance, the protein p53 activates endogenous retroviruses (ERVs) to induce immune responses against tumors disguised as normal cells, a process known as viral mimicry. This mechanism could explain the presence of viral-like sequences in genomes as a deliberate design for immune function, supporting the idea of purposeful genetic elements.

Pharmacologic Activation of p53 Triggers Viral Mimicry Response Thereby Abolishing Tumor Immune Evasion and Promoting Antitumor Immunity | Cancer Discovery

This was determined based on observations and experiments that show how the likely natural origin and design of viruses appear to mirror the artificial synthesis and design of viruses.

For instance, Owen’s theory, which intertwined the origins of species with those of life itself, proposed that natural selection alone could not fully explain the intricate processes underlying life’s emergence [55,63]. Viruses, in this context, could represent contemporary manifestations of the polarizing and adaptive forces, which establish the universal common archetype. They embody fundamental structural blueprints or archetypes, allowing interaction with host cells through convergent co-option. This process, observed in the shared ERV gene repurposed across distinct mammalian orders, exemplifies convergent co-option [82]. Furthermore, non-random mutations and HGT could be interpreted as mechanisms driven by this adaptive force, enabling viruses to adapt to environmental pressures. In keeping with Owen’s theory, viruses evolve through a combination of inherent structural characteristics (polarizing force) and adaptive modifications (adaptive force).

BTW, all three mechanisms have been shown to produce nested hierarchies. Do you want me to show you the studies again?

Given the extensive data showing transcription factors bind specific DNA sequences regulating gene expression, ENCODE’s conclusion remains a probable explanation. Though alternative theories exist, further experiments can narrow uncertainty around function in genome binding, reinforcing ENCODE’s perspective.

While transcription alone does not constitute function by the selection-effect definition, as Manolis Kellis et al. have argued, different scientific disciplines use various definitions of function. Genetic, evolutionary, and biochemical approaches each provide complementary evidence, and transcription may indicate biological relevance in a broader context.

Defining functional DNA elements in the human genome | PNAS

Achieving zero non-specific binding is likely impossible due to inherent interactions, but experimental conditions can significantly reduce it, enhancing specificity and sensitivity. Some non-specific binding may even yield insights into molecular selectivity, so the goal is optimization, not elimination.

Not exactly. Our model does not assume universal common ancestry but may appear as a variant of common descent to advocates of that view. Our approach represents a common design perspective, focusing on a guided process rather than purely ancestral lineage.

We address this within the causal role definition of function. Based on Owen’s archetype theory and ENCODE findings, we predict that over 50% of ERVs will demonstrate context-dependent regulatory functions, challenging the notion of ERVs as non-functional. Are you seeking specific experimental methods to test this hypothesis?
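One way such a prediction could in principle be tested statistically (the sample size and counts below are made up for illustration): assay a sample of ERV loci for context-dependent regulatory activity and apply a one-sided binomial test of whether the true proportion exceeds 0.5.

```python
# Hypothetical numbers: of 200 assayed ERV loci, 118 show context-dependent
# regulatory activity. One-sided binomial test of the proportion exceeding 0.5.
from scipy.stats import binomtest

result = binomtest(k=118, n=200, p=0.5, alternative="greater")
print(f"observed proportion = {118 / 200:.2f}, p-value = {result.pvalue:.4f}")
```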

We just interpret the same evidence differently. Natural selection addresses post-creation processes, while Owen’s natural law model explains both inanimate and animate structures. I can elaborate on this again if needed.

Good point. “Violation” may not be the best term. Rather, we see these processes as reducing disorder, which I described as “decay” in a non-technical sense.

These symbols are standard in cosmology, particularly in the Friedmann equation, which models the universe’s expansion.
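For reference, the standard first Friedmann equation, in which these symbols conventionally appear, is

$$\left(\frac{\dot a}{a}\right)^{2} = \frac{8\pi G}{3}\rho \;-\; \frac{k c^{2}}{a^{2}} \;+\; \frac{\Lambda c^{2}}{3},$$

where $a$ is the cosmic scale factor, $G$ Newton’s gravitational constant, $\rho$ the energy density, $k$ the spatial curvature constant, $c$ the speed of light, and $\Lambda$ the cosmological constant. (This is just the textbook form, not the specific model in the paper.)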

I appreciate the feedback. I’m open to refining the model to enhance clarity and consistency if that would make it more rigorous.

Well, I already made my intentions very clear on this forum. I specifically said in the OP that
I came on here to put everybody on notice about what we are doing, in order to garner interest and help attract the right reviewers for our article.

This is because we have found it challenging to identify a suitable group of reviewers capable of providing the thorough and rigorous evaluation necessary to ensure that the work receives the credibility it deserves within the scientific community. While some researchers have expressed interest, many potential reviewers are either unfamiliar with the interdisciplinary nature of the content or unable to offer the detailed assessment needed to validate the article’s unconventional propositions.

Now, I also said you could ask me some follow-up questions on this forum if you wanted to, but I was not expecting our discussions to turn into what they have become, because I understand this is a challenging field in which to find reviewers with both the interdisciplinary expertise and the openness to unconventional hypotheses.

This is why I was reluctant to go out of my way to copy and paste large excerpts for @T_aquaticus: it was going beyond what I originally set out to do on this forum. I did not anticipate the extended discussion with him here, but I have mainly engaged to clarify points raised.