rsewell: "plasma"
Plasma that emits as much light as a small galaxy, as I calculated here:
Given that uranium and thorium are the most common radioactive elements, we can estimate the energy released by looking at them. Both need roughly one half-life to elapse during those six months to reproduce the measured ages. Given their abundance in the earth and the earth's mass, that is roughly 4x10^44 parent atoms, about half of which decay in that time. Alpha decays emit about 10 MeV each, so some 3x10^32 joules get poured into the earth. For reference, that is slightly more than Earth's gravitational binding energy, or about the amount of sunlight that hits the earth over tens of millions of years. That is also enough to raise temperatures at something like 1000 K/s, re-melting the crust in about a second and vaporizing it in ten. By the end of six months, the ball of plasma once known as earth would be around 50 MK, producing a Planck curve that peaks in keV x-rays and outshining the sun by nine orders of magnitude, thus contributing about 10% of the energy output of the galaxy.
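If you want to poke at these numbers yourself, here is a quick back-of-the-envelope sketch in Python. The 4x10^44 parent atoms and 10 MeV per decay are the same rough inputs as above; the binding energy and intercepted-sunlight figures are standard round-number values, so treat the outputs as order-of-magnitude only:

```python
# Rough check of the accelerated-decay estimate above. All inputs are
# round numbers, not precise values.

SECONDS_PER_YEAR = 3.156e7
SIX_MONTHS   = 0.5 * SECONDS_PER_YEAR   # s
N_PARENTS    = 4e44                     # U + Th atoms in the earth (rough)
E_PER_DECAY  = 10 * 1.602e-13           # J (~10 MeV per alpha decay)
E_BINDING    = 2.2e32                   # J, Earth's gravitational binding energy
SUN_ON_EARTH = 1.7e17                   # W of sunlight intercepted by the earth

decays  = N_PARENTS / 2                 # about half decay in one half-life
e_total = decays * E_PER_DECAY          # J released over the six months
print(f"energy released: {e_total:.1e} J "
      f"(~{e_total / E_BINDING:.1f}x Earth's binding energy)")
print(f"sunlight equivalent: ~{e_total / SUN_ON_EARTH / SECONDS_PER_YEAR / 1e6:.0f} Myr")

# Wien (frequency) peak of a 50 MK blackbody: peak photon energy ~ 2.821 kT
K_B = 1.381e-23                         # J/K
T   = 50e6                              # K
peak_keV = 2.821 * K_B * T / 1.602e-16
print(f"Planck peak at 50 MK: ~{peak_keV:.0f} keV (x-rays)")
```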
The other option for speeding up decay (besides messing with the strong nuclear force) is to dramatically decrease the speed of light. That would decrease nuclear binding energies by the same factor, which would solve that part of the heat problem; unfortunately, it would also make protons and neutrons fall apart, so it doesn't really fix anything.
Plate movement alone is bad enough:
500,000,000 km^2 of surface times a 30 km thickness gives 1.5x10^19 m^3 for the crust; at a mean density of about 2.2 g/cm^3, that is 3.3x10^22 kg. Compressing plate tectonics into six months gives plate velocities of ~40 m/s, and hence a kinetic energy of 2.6x10^25 J. Friction turns that kinetic energy into heat while the driving forces keep replenishing it; each full dissipation raises the crust's temperature by 0.5 x v^2 / c ≈ 0.8 K (taking rock's specific heat as ~1000 J/kg·K), so with the energy turning over about once a second, the temperature of the crust rises at something like 1 K/s.
That's on par with a microwave oven, but this runs for six months. At that rate the crust re-melts in half an hour or so, and if the energy input continued for the full six months the earth would reach temperatures comparable to the sun's core.
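The same sanity check in Python; the once-per-second turnover of the plates' kinetic energy is the rough assumption behind the ~1 K/s figure, and everything else follows the numbers above:

```python
# Rough check of the compressed-plate-tectonics numbers above.

AREA       = 5.0e8 * 1e6   # m^2, Earth's surface area
THICKNESS  = 30e3          # m, crust thickness
DENSITY    = 2200.0        # kg/m^3, mean crust density
C_ROCK     = 1e3           # J/(kg*K), rough specific heat of rock
V_PLATE    = 40.0          # m/s, plate speed with deep time squeezed into 6 months
SIX_MONTHS = 1.6e7         # s
DT_MELT    = 1500.0        # K, rough temperature rise needed to re-melt the crust

mass = AREA * THICKNESS * DENSITY   # kg
ke   = 0.5 * mass * V_PLATE**2      # J
rate = 0.5 * V_PLATE**2 / C_ROCK    # K/s if the kinetic energy turns over each second
print(f"crust mass: {mass:.1e} kg, kinetic energy: {ke:.1e} J")
print(f"heating rate: ~{rate:.1f} K/s")
print(f"time to re-melt the crust: ~{DT_MELT / rate / 60:.0f} min")
print(f"after six months: ~{rate * SIX_MONTHS:.1e} K (sun's core is ~1.5e7 K)")
```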