A question for the Fine-tuning argument

It isn’t a trigger word. It just happens to have a specific meaning instead of being anything I want it to be at that moment. I also don’t see how the cause of things like radioactive decay or quantum effects could be outside of nature or science. That seems to be rock solid in the center of nature and science.[quote=“Mervin_Bitikofer, post:60, topic:36932”]
Has this actually been observed? Or is it just hypothesized? I have no idea what the “Casimir effect” is and so can’t speak to that specifically. I do know generally, though, that declaring there to be no cause where none has been demonstrated is not the game an empiricist should be eager to play. It is like me declaring that anything we haven’t seen cannot exist. That is quite an unimpressive leap of faith on your part if you wish to go there. I like to have a bit more evidence for my leaps of faith.
[/quote]

And once again we see an attempt at a false equivalency by pretending that other people have faith when they don’t. Virtual particles popping in and out of existence with no cause is just part of our reality in the same way that gravity and electromagnetism are properties of our reality. It is also what fuels Hawking radiation around black holes and other singularities, and some people have claimed to have produced Hawking radiation in the lab:

These are observations. No faith needed.[quote=“Mervin_Bitikofer, post:60, topic:36932”]
It is a non-faith statement because it isn’t really a statement at all --at least not an informative one. Saying things are random because you can produce a normal distribution is the same as saying that things are the way they are. It tells us nothing about why they are that way. So it is tautology.
[/quote]

If making tentative conclusions based on observations is a tautology, then all of science is a tautology.[quote=“Mervin_Bitikofer, post:60, topic:36932”]
I disagree with your first statement above only if you pushed it as an absolute – which is why I absolutely appreciate and agree with your second one as the much needed, qualifying, caveat. “Random” is a good, indisputably useful placeholder word for us in our everyday practices and perceptions, but I don’t see that as evidence that the concept goes deeper into the very roots of phenomena themselves. Perhaps it is --and if so, I’m in awe of whatever impenetrable mystery would be behind such an opaque wall. If not, well then at least one thing about physics finally matches my intuitive comprehension.
[/quote]

I have said multiple times that I am making tentative conclusions, which is just another way of saying that I am not making claims of absolute truth.

So it doesn’t take faith to make tentative conclusions, contrary to the label you have been trying to pin on me in previous posts.


The fine tuning argument is sort of like the claim that God answers every prayer request because “no” is an answer. We have no way to access the “no” answers.

@T_aquaticus

Okay, let’s examine this. Yes, it is true that molecules move, and they move “randomly” because they contain energy. If they did not move they would have to be at absolute zero. Energy, or heat, makes molecules move. That is not a process; it is a rest state of nature, since energy and mass are interdependent. However, mass at a state of rest does lose energy.

However, changes in temperature are not random, and this is what I am talking about. When I light a fire in the morning, I apply heat to paper and wood so that they combine with oxygen, catch fire, and produce heat. Heat is produced by oxidation, which changes molecules and releases energy, and that energy makes other molecules move faster.

Thermodynamics is cause and effect, even the Second Law.


It may be “in the center of science” in the sense that we’ve been speaking of these things for many decades now and they’ve become familiar terrain along our long-developed highways of scientific discourse. But in another sense I think radioactive decay might still be at the edge of science. It hinges on this question: has anybody explained why a particular particle decays when it does? My understanding when I last read about this was that it is still fairly mysterious. Perhaps some tentative mechanisms have been proposed in recent years. None of that escapes this logical choice: either a particular particle decay is caused by something (hence not truly random) or it is truly random (without a cause that could be scientifically observed). The latter case places this phenomenon not in the center of science, but rather at its boundary.

You can bring up all the latest phenomena and names you want – none of it serves to illuminate (to me) how you see it as evidence of inherent randomness. If you thought that any of these phenomena actually did signal evidence of some true randomness, then far from being in the center of science, you would have found something at the edge of science (a true science-stopper like none ever seen before), which in itself would still be an amazing (to me) discovery, classical QM and uncertainty principles notwithstanding.

That may actually be a more deeply profound truth than you might have intended. After all what else can science do but go along with the conviction that “what is, is”. It’s only when we try to pretend this postulate is a theorem that we get ourselves into trouble.

Perhaps not. But one does end up living by one tentative assumption or another in some important cases. That looks like faith to me.

[added edit: when I said above that: “After all what else can science do but go along with the conviction that “what is, is”. It’s only when we try to pretend this postulate is a theorem that we get ourselves into trouble.” … I will hasten to add that I’m not questioning this ‘postulate’ in the least. I strive to live solidly by it myself. I only point out what should be obvious: that my sincere conviction in this does not constitute proof.]

@T_aquaticus,

Radioactive decay is part of the natural fact that everything changes. This is a part of the universe but does not make the universe work. The same thing with quantum effects.

Science has to a large extent been based on reductionism, which means that the basis for understanding the universe is to understand its smallest part or parts. It is quite evident today that this is not the right approach. The random motion of molecules tells us nothing about everyday reality, except that energy and mass are interdependent.

The universe began with the Big Bang, which set the whole process that we call Reality in motion. This process has little or nothing to do with the random motion of individual molecules. There is little evidence that quantum effects have much effect on individual molecules.

It is the interdependence of the universe which makes the universe what it is, not the activity of individual atoms and molecules. It is the interdependence of Reality which is evidence of the rationality of the universe, and of the fact that it has meaning and purpose.

Just to be pedantic, absolute zero does not mean the absence of all motion. You can’t have a complete absence of motion without violating the uncertainty principle. Absolute zero means that everything is in its ground state, not that everything is motionless.
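To put a formula behind that (standard textbook quantum mechanics, not something established in this thread): the uncertainty relation rules out a state with both a definite position and exactly zero momentum, so even at absolute zero a bound system such as a harmonic oscillator keeps a zero-point energy:

```latex
\Delta x\,\Delta p \ge \frac{\hbar}{2}
\quad\Longrightarrow\quad
E_0 = \frac{1}{2}\hbar\omega > 0
```

So the “ground state” at 0 K still involves motion, just the minimum amount the uncertainty principle allows.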


David,

It seems that the Wiki agrees with my view. I do not think that the uncertainty principle applies here.

“At temperatures near 0 K (−273.15 °C; −459.67 °F), nearly all molecular motion ceases.” (from the Wikipedia article on absolute zero)

It’s not important and I don’t wish to belabor the point (and I’m getting scolded by the BioLogos purity/morality bot), but you did notice the qualifier “nearly”, I assume. The Wikipedia article on absolute zero is correctly explicit that all motion does not cease. Here is a little discussion on a physics site.


You said that random processes cannot produce natural laws, yet temperature, combined with things like thermodynamics and the ideal gas law, demonstrates that random processes do underlie natural laws.

On top of that, changes in temperature are governed by random processes as well. Transfer of heat from one object to another requires randomly moving molecules running into each other and imparting kinetic energy. Transfer of energy by photons is governed by random events surrounding the absorption and emission of photons, which closely mimics the double slit experiment.[quote=“Relates, post:63, topic:36932”]
When I light a fire in the morning, I apply heat to paper and wood so that they combine with oxygen, catch fire and produce heat. Heat is produced by oxidation which changes molecules and releases energy and energy makes other molecules move faster.
[/quote]

That heat is the result of random interactions between molecules. The emission of IR photons, which you feel as heat, is a random process.[quote=“Relates, post:63, topic:36932”]
Thermodynamics is cause and effect, even the Second Law.
[/quote]

Thermodynamics doesn’t work without randomness. You don’t see higher energy molecules spontaneously move to one side of a container and lower energy molecules move to the other side. Instead, these molecules bang into each other randomly and disperse energy through the system, which we describe as an increase in entropy.
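To illustrate (a toy sketch of my own, not a physical simulation): start with a “hot” half and a “cold” half and let randomly chosen pairs of molecules repartition their energy at random. No directed mechanism is needed for the two halves to equilibrate.

```python
import random

random.seed(0)  # deterministic for the demonstration

N = 1000
# Hot half at energy 10, cold half at energy 1 (arbitrary units).
energies = [10.0] * (N // 2) + [1.0] * (N // 2)

def collide(e1, e2):
    # Stand-in for a collision: split the combined energy at a
    # uniformly random fraction; total energy is conserved.
    total = e1 + e2
    split = random.random()
    return total * split, total * (1 - split)

for _ in range(50_000):
    i, j = random.randrange(N), random.randrange(N)
    if i != j:
        energies[i], energies[j] = collide(energies[i], energies[j])

left = sum(energies[: N // 2]) / (N // 2)
right = sum(energies[N // 2 :]) / (N // 2)
print(left, right)  # both halves should land near the overall mean of 5.5
```

Nothing ever tells energy to flow from hot to cold here; the even spread falls out of the random collisions alone.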

Again, I don’t understand why you think having a cause prevents a process from being random. Perhaps you could explain the logic that undergirds this idea. Using radioactive decay as our example, I could set up a random number generator that uses radioactive decay as the cause of number generation. Those numbers would be random, or at least be consistent with every model we have for randomness.

As to radioactive decay itself, it is caused by particles using quantum tunneling to escape an unstable nucleus. I could be wrong, but I think @glipsnort knows way more about this than I do. From memory, we can produce fairly accurate estimates of decay rates from first principles, and the quantum tunneling process is random with respect to which nucleus decays next and when it does so. They decay because there is a non-zero probability that they will.[quote=“Mervin_Bitikofer, post:64, topic:36932”]
That may actually be a more deeply profound truth than you might have intended. After all what else can science do but go along with the conviction that “what is, is”. It’s only when we try to pretend this postulate is a theorem that we get ourselves into trouble.
[/quote]

Any epistemology requires axioms, and one of the axioms for the epistemology of science is that the world we see is real. If you think that accepting the world around us as real somehow indicates a faith-based belief, then I think you have stretched the term “faith” beyond the breaking point. It no longer has meaning. It seems that the only reason you are trying to make “faith” mean anything and everything is to create some false equivalency between our positions.[quote=“Mervin_Bitikofer, post:64, topic:36932”]
Perhaps not. But one does end up living by one tentative assumption or another in some important cases. That looks like faith to me.
[/quote]

What doesn’t look like faith to you?

Good post.

Of course, fine-tuning does not imply the need for a “fine-tuner”. To physicists, ‘fine-tuning’ indicates that there is a certain sensitivity of an outcome to some input parameters or assumptions.

For instance, if an experiment produces a unique outcome only as a result of a very defined or precise set-up, then the experiment is said to be “fine-tuned” as regards that outcome. (I would obviously go a bit further than this and, noting the standard definition, argue fine-tuning further implies a contrast between a wide range of possibilities and the narrow range of the particular outcome in question).

‘Fine-tuning for life’ is a bit of a misnomer: I would prefer the term “fine-tuning for complexity/complex chemistry” or the “complex chemistry principle” rather than the anthropic principle…but whatever terminology one invokes, it does refer to a particular kind of physics fine-tuning, where the outcome is the “complex chemistry” necessary for life and, by extension, “conscious observers” like ourselves (that’s where the “anthropic” element really comes in).

Personally, I am inclined to think that if we really cannot calculate the cosmological constant etc. working from first principles (as seems to be the case, unless some underlying mechanism is discovered to explain its knife-edge nonzero value and I’d be very impressed if some genius did manage to convincingly do this), then philosophy is our best and perhaps only avenue for making sense of the fine-tuning dilemma.

Which is to say, I’m not too sure that fine-tuning, while undoubtedly being a scientific problem, necessarily has a scientific answer. It may be the first such case in history, where science just cannot decisively solve the puzzle.

The anthropic argument, the multiverse, God etc. are all philosophical notions that take one outside the bounds of testable science. Perhaps there is an underlying physical mechanism we just have no knowledge of yet but the way in which some of the greatest minds in theoretical physics feel compelled to make recourse to anthropic/multiverse philosophizing, rather than return to actually trying to solve the “puzzle” from first principles, might suggest that it will not have a conventional explanation. Who knows.

The reason I dislike the inflationary multiverse paradigm is because it just seems like an excuse for failure - to drag philosophical speculation, however empirically based and explanatory or elegant in nature it might be, into scientific discussion. It amounts to this sort of reasoning,

“Oh well, our ability to test high energy physics is approaching its limits: we can’t seem to calculate the observed value of the cosmological constant, and supersymmetry may not be found at the LHC to explain the mass of the Higgs, so yeah - the multiverse must have done it! And even if we can’t actually test it because of the particle horizon etc., that’s no big deal; we should just accept it as the explanation anyway because it accounts for the data and seems to be the only game in town”.

Is that how scientific inquiry worked out in the past? From the erroneous Aristotelian understanding of motion and the geocentric model of the universe to Fred Hoyle’s Steady State theory, science has ultimately lived or died on the basis of testable predictions. Each of these models was ultimately falsified by tested theories (i.e. our modern concept of momentum/impetus, heliocentrism, and the cosmic microwave background validating the Big Bang).

Perhaps it is our very starting assumption that everything must of necessity be comprehensible in terms of equations and physics, that nothing lies outside the domain or scope of scientific inquiry, which has placed us in this “fix” in the first place.


The Wiki says that near Absolute Zero nearly all motion ceases. The clear implication is that at Absolute Zero all motion ceases. Since I expect that Absolute Zero has never been reached, we really cannot say for sure.

Nota bene: People on the physics site had different opinions.

The situation is different for a free particle. In that case, at absolute zero the momentum is zero but then we have no knowledge about where the particle is (i.e. \Delta x = \infty). If we want to measure where the particle is we have to put some energy in, but then of course the system is no longer at absolute zero and the momentum is now non-zero.

This one agreed with me and I think is the best.

Hi David, would you mind clarifying…? you write “They argue… that the values of the constants are vanishingly small.” But they don’t. Did you mean “They argue… that the probabilities of the (range of) value(s) that the constants take are vanishingly small”? But vanishingly small probabilities on constant values don’t imply a multiverse unless you adhere to a remarkably literal frequentist notion of probability…


An interesting analogy I ran across was Steven Weinberg’s “Earthprime”.

For an analogy, suppose that there is a planet called Earthprime, in every respect identical to our own, except that on this planet mankind developed the science of physics without knowing anything about astronomy. (E.g., one might imagine that Earthprime’s surface is perpetually covered by clouds.) Just as on earth, students on Earthprime would find tables of fundamental constants at the back of their physics textbooks. These tables would list the speed of light, the mass of the electron, and so on, and also another “fundamental” constant having the value 1.99 calories of energy per minute per square centimeter, which gives the energy reaching Earthprime’s surface from some unknown source outside. On earth this is called the solar constant because we know that this energy comes from the sun, but no one on Earthprime would have any way of knowing where this energy comes from or why this constant takes this particular value. Some physicist on Earthprime might note that the observed value of this constant is remarkably well suited to the appearance of life. If Earthprime received much more or much less than 2 calories per minute per square centimeter the water of the oceans would instead be vapor or ice, leaving Earthprime with no liquid water or reasonable substitute in which life could have evolved. The physicist might conclude that this constant of 1.99 calories per minute per square centimeter had been finetuned by God for man’s benefit. More skeptical physicists on Earthprime might argue that such constants are eventually going to be explained by the final laws of physics, and that it is just a lucky accident that they have values favorable for life. In fact, both would be wrong. 
When the inhabitants of Earthprime finally develop a knowledge of astronomy, they learn that their planet receives 1.99 calories per minute per square centimeter because, like earth, it happens to be about 93 million miles away from a sun that produces 5,600 million million million million calories per minute, but they also see that there are other planets closer to their sun that are too hot for life and more planets farther from their sun that are too cold for life and doubtless countless other planets orbiting other stars of which only a small proportion are suitable for life. When they learn something about astronomy, the arguing physicists on Earthprime finally understand that the reason why they live on a world that receives roughly 2 calories per minute per square centimeter is just that there is no other kind of world where they could live. We in our part of the universe may be like the inhabitants of Earthprime before they learn about astronomy, but with other parts of the universe instead of other planets hidden from our view.(Weinberg, S., “Dreams of a Final Theory,” Pantheon: New York NY, 1992, pp.252-253. Emphasis original)

The scientists on Earthprime could not initially predict the temperature constant on their planet from first principles, but that was because many different temperature constants were possible across planets. The same could be true for universes with respect to the cosmological constant.
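For what it’s worth, Weinberg’s numbers check out against the inverse-square law (my own back-of-the-envelope arithmetic, using only the figures quoted in the passage):

```python
import math

luminosity = 5.6e27       # cal/min: "5,600 million million million million"
distance_miles = 93e6     # Earthprime's distance from its sun
distance_cm = distance_miles * 160934.4   # 1 mile = 160,934.4 cm

# The star's output spreads over a sphere whose radius is that distance.
flux = luminosity / (4 * math.pi * distance_cm ** 2)
print(round(flux, 2))  # 1.99 cal/min/cm^2, the "fundamental constant"
```

So the “mysterious constant” on Earthprime is fully determined once you know the luminosity and the distance, which is exactly Weinberg’s point about hidden explanations.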

Yes and no. Molecules are in constant motion because they contain energy. That motion, which is random, helps distribute dye and molecules throughout a volume of water. However, heat is not distributed in a random (haphazard or uneven) manner; per the laws of thermodynamics, heat is distributed evenly through the volume of water. Warmer water loses heat and cooler water gains heat, and this is what is not random, or rather happens with a probability of 1.

When I light a match it is not a random action. The flame is not a random interaction between molecules; it is a specific interaction between heat, oxygen, and carbohydrates. Heat produces more heat, not randomly, but specifically as cause and effect.

Doug,

Yes, as you wrote, I meant “that the probabilities of the (range of) value(s) that the constants take are vanishingly small”. I also would not say they imply a multiverse. To clarify, those multiverses (such as the String landscape) that predict essentially a semi-infinite number of universes, each with a random draw of the constants, are inherently (or perhaps trivially) consistent with a fine-tuned universe with uber-fortunate constants.


The even distribution of heat is due to randomness. A non-random process would not evenly distribute heat but rather concentrate heat in one area.[quote=“Relates, post:75, topic:36932”]
The flame is not a random interaction between molecules, it is a specific interaction between heat, oxygen, and carbohydrates.
[/quote]

That is false. The randomly moving oxygen molecules bump into the molecules in the match, and that results in combustion. These are quantum-level events.

Starting with your last question first … and this is a good question, since I have a broad enough definition of faith that I have to think (and look) harder to find a lack of it than a presence of it. But to give an answer: a lack of faith, to me, looks like somebody deciding not to trust in somebody or something when they had at least some warrant in which to ground such trust. If somebody told me they would do something, and I knew them to be generally trustworthy in the past, but I was still nervous that they might forget and so did it myself before giving them a chance to follow through on their word, that would be a lack of faith on my part. I also see it as revealed more in our actions and ways of living than in our words.

Let’s try this thought experiment. If I took a quarter and, right in front of you, placed it on the table heads up, would you consider it random that the coin was in that state? And suppose I proceed to tally up a lot of “flips”, only I never really toss it; I carefully choose which I want (heads or tails), put it on the table, and then tally my results. Suppose I even went to the trouble of matching a previously (genuinely flipped) sequence so that my choices cannot be distinguished from a random outcome. Would you say that my crafted coin results were random? If not, why not?
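To make the point concrete, here is a toy sketch (mine, purely illustrative): because the crafted sequence is copied from a genuinely flipped one, any statistic computed on the two agrees. Tests see only outcomes, never their causes.

```python
import random

random.seed(7)
flipped = [random.choice("HT") for _ in range(1000)]  # stand-in for real tosses
crafted = list(flipped)  # placed by hand to match, never actually tossed

def heads_fraction(seq):
    return seq.count("H") / len(seq)

def run_count(seq):
    # Number of maximal same-face runs, a common randomness statistic.
    return 1 + sum(a != b for a, b in zip(seq, seq[1:]))

print(heads_fraction(flipped) == heads_fraction(crafted))  # True
print(run_count(flipped) == run_count(crafted))            # True
```

No frequency or runs test can separate the two, even though one sequence was flipped and the other deliberately placed.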

Given your previous posts, it would take faith to even assume that the other person exists to begin with. It would also take faith, according to your proposed worldview, to use any past experiences as a reason for doubting what somebody else says because those events may not be real, assuming through faith that you even exist as a person in a real world.[quote=“Mervin_Bitikofer, post:78, topic:36932”]
Let’s try this thought experiment. If I took a quarter, and right in front of you, placed it on the table heads up, would you consider it random that the coin was in that state?
[/quote]

No, I wouldn’t assume that it was random. I also wouldn’t assume that this example translates to all of nature. Just because we can find examples of non-random behavior does not mean that all of nature is non-random. That seems to be an unwarranted extrapolation.

Fair enough. The underlying question in the context of fine-tuning is: “Why those constants – constants that provide for habitability – and not others?” And, as you say, the tendency to address such “why?” questions probabilistically is misguided, however common. But to say that some event is “consistent” with a multiverse is … almost tautologous. :wink:
