I agree on an old universe, but radiometric dating has some flaws.
If the Cambrian Explosion represented creation around 7,000 years ago, and the speciation observed in fossils (within clades) was merely rapid speciation, this would actually fit the fossil record satisfactorily. Rapid speciation is an observed phenomenon, occurring primarily via changes to allele frequencies with an occasional beneficial mutation.
The biggest factor within geology pointing to long time frames is radiometric dating, which has some flaws in it. Other geological events can occur quickly.
Welcome to BioLogos, Wade. While I do think it's true that some species can rapidly adapt to geological disturbances and other environmental phenomena, I wouldn't feel comfortable using that to support a 7,000-year-old date for Planet Earth. There are some other discussions on BioLogos where the questions you're raising here might fit in a bit better. This particular thread has been a confusing mishmash of thought experiment and rejection of the thought experiment. So good luck!
Hi, thanks for the advice. Just to give some background to my belief: I'm a Bible literalist and therefore not YEC, because the earth already existed, formless and empty, before even the first day of creation week.
So I have no objection to the science that points to an old earth and an old universe. Genesis 1 mentions terrestrial plants and living organisms, defining living organisms as those with the breath of life, a term associated with life-blood (hemoglobin).
The existence of microbes and small shellies before the Cambrian Explosion is irrelevant to creation week; these organisms are not mentioned in Genesis 1. The Cambrian Explosion is creation: this is when animals with life-blood were created.
Anyway, that's my view for the record.
Klax
(The only thing that matters is faith expressed in love.)
Was that an additive factor or a multiplicative factor? If the latter, then "years" is superfluous and confusing. So the Cambrian was a hundred thousand times nearer to us than now, five hundred and forty-one million years out, due to what flaws in radiometric dating?
Thanks for pointing out my superfluous use of the word "years". My terminology isn't always clear.
The flaws are those shown by the Purdue studies on solar flares and seasonal fluctuations. Radiometric decay rates were thought to be a constant, yet they are not. The effect currently appears negligible on the studied isotopes. But until the cause and effect are sufficiently understood, and until the isotopes actually used in radiometric dating (those with longer half-lives) are emphasized in these studies, there will always be doubts as to the accuracy of the theorized dates.
Klax
I wouldn't call a factor of one hundred thousand negligible, Wade. Can you point to that on Wikipedia or some other disinterested scientific source?
I said the observed fluctuations are negligible, but need to be studied more.
Until they are studied further we cannot fully trust the dates given, because there is an unknown factor that influences the presumed constancy of decay.
The studies have not emphasized the isotopes used in radiometric dating; they have instead focused on isotopes with shorter half-lives. So the effect on the isotopes with longer half-lives is unknown. We cannot trust a method with unknown and insufficiently studied flaws.
The studies even show seasonal fluctuations in decay rates previously thought to be constant, and there are attempts to eliminate instrument error in the findings:
Welcome to the forum, Mindspawn. I do not think the studies say what you think they say. Looking at them, they report variation of a fraction of a percent, within the known error bars of testing. In other words, so small that it does not matter for the purpose of dating. On the other hand, a difference of a factor of even a thousand would mean a huge difference. It is like telling the highway patrolman your speedometer was off and you were really going 20.1 in the 20 mph school zone, when he clocked you at 20,000 mph in your new Saturn V Velociraptor.
Many studies have been done showing that decay is reliable. While the specifics are beyond my understanding, you can even look at decay rates in old supernovas to show that even in the distant past, decay rates have not changed. This article discusses it about halfway down: https://www.astronomynotes.com/solfluf/s4.htm
So, while you are free to doubt the reliability of radiometric dating, the studies you quoted on variable rates only confirm that it is valid within its known limitations, with minor variations in decay being negligible.
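To put rough numbers on "a fraction of a percent", here is a back-of-the-envelope sketch (my own illustration, not from the studies; the Pb/U ratio is a hypothetical value chosen to land near a Cambrian-age date) showing how a small error in a decay constant propagates into a radiometric date:

```python
import math

# Standard radiometric age equation: t = ln(1 + D/P) / lambda,
# so a fractional error e in the decay constant shifts the computed
# age by the same fraction: t' = t / (1 + e).

HALF_LIFE_U238 = 4.468e9            # years (well-established value for U-238)
lam = math.log(2) / HALF_LIFE_U238  # decay constant, per year

def age(daughter_parent_ratio, decay_const):
    """Age from a measured daughter/parent isotope ratio."""
    return math.log(1 + daughter_parent_ratio) / decay_const

ratio = 0.0885                      # hypothetical measured ratio (~Cambrian age)
t = age(ratio, lam)                 # roughly 547 Myr

# Suppose the decay constant were off by 0.1% -- larger than the
# claimed seasonal fluctuations:
t_wrong = age(ratio, lam * 1.001)
print(f"age            : {t / 1e6:.1f} Myr")
print(f"age, 0.1% error: {t_wrong / 1e6:.1f} Myr")
print(f"shift          : {abs(t - t_wrong) / 1e6:.2f} Myr")
```

Even an exaggerated 0.1% wobble in the decay constant moves a ~547 Myr date by only about half a million years, nowhere near the factor needed to reach a thousands-of-years timescale.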
There's something very important that you need to realise about radiometric dating, and in fact about measurement in general, here.
Unreliability can be quantified.
There is a massive difference between unreliability in the sense of "possibly out by a fraction of one percent" and unreliability in the sense of "so out of whack that it consistently fails to distinguish between thousands and billions." The former case is just error bars. The latter case is science fiction.
The young-earth organisations themselves have admitted that accelerated nuclear decay on the scale required to squeeze all the radiometric evidence into just six thousand years would have released enough heat to raise the temperature of the Earth to 22,000°C. That was one of the conclusions of the RATE project, the most extensive, expensive and comprehensive YEC investigation into the reliability of radiometric dating ever conducted. They have no explanation for where the heat could have gone, other than an appeal to ad hoc miracles that serve no purpose whatsoever other than to make the Earth look older than it really is, in the most complicated and convoluted way imaginable, for no obvious reason.
That link is just textbook stuff, based on old assumptions of constancy.
I agree this unknown effect is observed to be negligible under current conditions. My logic stands that the effect is unknown; therefore we cannot be sure the effect will remain negligible under all conditions if we do not even know how it works. That's pretty obvious, considering it's unknown. If many scientists confidently tell me the effect will always be negligible, I can only question their commitment to science. How can you know, if the cause and effect are unknown?
I've heard that argument before, but it is based on an accumulation of radioactivity to current levels.
If decay had always been rapid, the accumulated radioactivity of the earth would be less, and the earth would be at equilibrium as it is now. This is a point few will understand; the point stands nevertheless.
Sure, if you released all the accumulated radioactivity now, there would be a huge problem; I agree fully with that.
Feel free to ignore those textbooks and the accumulated observations of thousands of scientists over decades of work. But don't fool yourself into believing you have a scientific argument based on observed findings, when it is a faith belief based on a particular interpretation of scripture.
Sure I respect science and love science. Science is always open to evidence and logical thinking and good hypotheses and theories. This is why I love it. It's in flux, ready to be improved.
On the contrary, we do know how nuclear decay works. The effects involved (the strong and weak nuclear forces, the interactions between the particles that make up the atomic nucleus, the standard model of particle physics and so on) are well understood, having been studied extensively and rigorously in nuclear reactors, particle accelerators and astronomical observations for over a century. On the scale of an atomic nucleus (about 10⁻¹⁵ metres), these effects predominate overwhelmingly over any external environmental effects such as temperature, pressure, electromagnetic radiation and so on. Even small changes in nuclear decay rates would require radical new laws of physics for which there is simply no evidence whatsoever. As for the much larger changes needed to collapse the evidence from 4.5 billion years down to six thousand: that isn't going to happen, it's as simple as that.
And how, precisely, does being "based on an accumulation of radioactivity to current levels" call it into question?
If decay had always been rapid, then we would see evidence of that in nature. In particular, we would see radiometric results diverging from each other in very consistent and mathematically coherent ways. We certainly wouldn't see radiometric results for the past 80 million years lining up precisely with rates of continental drift as determined by direct GPS readings:
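The drift cross-check is simple arithmetic. A hypothetical sketch (illustrative numbers of my own, roughly typical of mid-ocean-ridge spreading, not data from the linked chart):

```python
# Present-day spreading rate, measurable directly by GPS (assumed value):
gps_rate_cm_per_yr = 2.5

# Radiometric age of a magnetic stripe of seafloor some distance from
# the ridge (assumed value):
seafloor_age_yr = 80_000_000

# If decay rates have been constant, the stripe should sit this far out:
distance_km = gps_rate_cm_per_yr * seafloor_age_yr / 1e5  # cm -> km
print(f"predicted distance: {distance_km:,.0f} km")       # 2,000 km
```

If decay had ever run tens of thousands of times faster, the radiometric ages would predict stripe positions wildly inconsistent with the directly measured spreading rates; instead the two lines of evidence agree.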
Sure you may respect science and love science, but do you understand science? Does your view of science consist mainly of the results that it produces, or are you familiar with the methods used to produce those results? Have you been trained in maths, logic and the exact, rigorous and disciplined ways of thinking that science demands?
That's a fascinating article. I have no objection to an old earth and old universe. I just believe the organisms actually mentioned in Genesis 1 were created less than 10,000 years ago, but the earth did exist before creation week, as described in Gen 1:1-2.
Regarding the accuracy of the dates in the article, it's entirely possible that decay rates are different in space compared to the assumed constant rate. The study of fluctuations in decay rates is still in its infancy, as we do not even know what causes the fluctuations.
No it's not. We have astronomical observations that show they're exactly the same, as @jpm pointed out in his post above. And the rates are not assumed; they are measured.
No it's not. As I pointed out, nuclear decay is a mature and robust area of study. Furthermore, the single study that appeared to show solar-dependent fluctuations in decay rates has never been replicated and was in fact falsified in 2014: