All of the above would be serious problems, but the plate movements and increased decay rates would be especially bad.
500,000,000 km^2 of surface times a 30 km mean thickness gives 1.5x10^19 m^3 for the crust; at a mean density of about 2.2 g/cm^3, that is 3.3x10^22 kg. Compressing billions of years of plate tectonics into 6 months gives plate velocities of ~40 m/s, and hence a kinetic energy of 2.6x10^25 J. That is about 800 J/kg (half of v^2), roughly what it takes to heat rock by one kelvin, and friction at the plate boundaries keeps turning that kinetic energy into heat while the plates keep being driven at the same speed. If the kinetic energy is dissipated and replenished on a timescale of order a second (plausible for plates grinding past each other at 40 m/s), the temperature of the crust climbs at something like 1 K/s.
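A quick sketch of that arithmetic, using the same round inputs as above; the once-per-second dissipation-and-replenishment cycle is the load-bearing assumption here, not a measured quantity:

```python
# Rough check of the crust figures above; all inputs are the round
# numbers quoted in the text, so treat everything as order-of-magnitude.
area = 5.1e14        # Earth's surface area in m^2 (~5.1e8 km^2)
thickness = 30e3     # assumed mean crust thickness in m
density = 2.2e3      # assumed mean crust density in kg/m^3
mass = area * thickness * density     # ~3.3e22 kg
v = 40.0             # m/s, if billions of years of drift happen in 6 months
ke = 0.5 * mass * v**2                # ~2.6e25 J
per_kg = 0.5 * v**2                   # ~800 J/kg, about 1 K worth of heat
                                      # in rock (c ~ 800-1000 J/(kg K))
print(f"mass ~ {mass:.1e} kg, KE ~ {ke:.1e} J, specific KE ~ {per_kg:.0f} J/kg")
```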
One kelvin per second is on par with a microwave oven, but this one runs for six months. At that rate it takes only half an hour or so to re-melt the crust, and if the energy input continued for the full six months the earth would reach temperatures comparable to the sun's core (~1.5x10^7 K).
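Following the rate through, under the same assumptions:

```python
# Follow-through on the ~1 K/s heating rate estimated above.
rate = 1.0           # K/s, assumed sustained heating rate
dT_melt = 1.5e3      # K, rough temperature rise needed to re-melt crustal rock
six_months = 1.6e7   # s
print(f"time to melt ~ {dT_melt / rate / 60:.0f} min")        # ~25 min
print(f"after six months ~ {rate * six_months:.1e} K")        # ~1.6e7 K
# The sun's core is ~1.5e7 K, hence the comparison in the text.
```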
Given that uranium and thorium are the most common heavy radioactive elements, we can estimate the energy released by looking at them. Both need roughly one half-life to elapse within the six months to produce the measured ages. Given their abundance in the earth and the earth's mass, that works out to roughly 4x10^44 alpha decays in total (each decay chain down to lead involves several alphas). Alpha decays release about 5 MeV each, so about 3x10^32 joules get poured into the earth in that time. For reference, that is slightly more than Earth's gravitational binding energy (~2x10^32 J), or about the amount of sunlight that hits the earth in fifty million years. Spread over six months and the earth's mass, that is roughly 3 W/kg, heating rock at something like 250 K per day: the crust re-melts within the first week, and the rock itself starts boiling off within a couple of months. By the end of six months the earth has absorbed about 5x10^7 J/kg, several times what it takes to vaporize rock and more than its gravitational binding energy per kilogram, leaving an expanding cloud of rock vapor rather than a planet. Radiating away the average decay power of ~2x10^25 W from an earth-sized surface implies a temperature around 30,000 K, with a Planck curve peaking in the far ultraviolet and a luminosity of a few percent of the sun's.
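A sketch of that budget; the specific heat, the rock-vaporization energy, and the glow-temperature step are rough textbook-scale assumptions rather than numbers from the text:

```python
import math

# Order-of-magnitude energy budget for the accelerated-decay scenario above.
MeV = 1.602e-13      # J
decays = 4e44        # total alpha decays over six months (from the text)
E_total = decays * 5 * MeV            # ~3e32 J at ~5 MeV per alpha
M_earth = 6.0e24     # kg
six_months = 1.6e7   # s
c_rock = 1.0e3       # J/(kg K), assumed specific heat of rock
E_vap = 1.5e7        # J/kg, assumed energy to heat, melt, and boil rock
E_bind = 2.2e32      # J, Earth's gravitational binding energy

p = E_total / six_months / M_earth    # ~3 W/kg average
print(f"total {E_total:.1e} J vs binding energy {E_bind:.1e} J")
print(f"heating ~{p / c_rock * 86400:.0f} K/day")            # ~270 K/day
print(f"rock boils off in ~{E_vap / p / 86400:.0f} days")    # ~2 months

# Blackbody temperature needed to radiate the average power from an
# Earth-sized surface: L = 4*pi*R^2 * sigma * T^4.
sigma, R = 5.67e-8, 6.37e6
T = (E_total / six_months / (4 * math.pi * R**2 * sigma)) ** 0.25
print(f"glow temperature ~{T:.0f} K")                        # ~3e4 K, far UV
```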
That is to say nothing of the fact that increasing the weak nuclear force by a factor of 10^10 and decreasing the strong nuclear force by a similar factor would prevent any atoms larger than hydrogen from existing.