5 assumptions of radiometric dating
Radiometric dating is a method of determining the age of an artifact by measuring the amount of radioactive decay that has occurred, on the assumption that decay rates have on average been constant (see below for the flaws in that assumption).
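As a rough illustration of the principle only (not of any particular laboratory procedure), the age follows from the fraction of the parent isotope that remains, given an assumed constant decay rate. The function name and the example half-life below are illustrative assumptions, not taken from the text:

```python
import math

def age_from_remaining_fraction(remaining_fraction, half_life_years):
    """Solve N(t) = N0 * exp(-lambda * t) for t, given N(t)/N0.
    Assumes the decay constant has never varied."""
    decay_constant = math.log(2) / half_life_years
    return math.log(1.0 / remaining_fraction) / decay_constant

# With half the parent isotope left, exactly one half-life has elapsed.
# (1.25e9 years is roughly the commonly cited potassium-40 half-life.)
print(age_from_remaining_fraction(0.5, 1.25e9))  # ≈ 1.25e9 years
```

Note that the calculation needs two inputs besides the measurement: the half-life and the starting quantity, which is where the assumptions discussed in this article enter.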
Radiometric dating is mostly used to determine the age of rocks, though a particular form of radiometric dating—called radiocarbon dating—can date wood, cloth, skeletons, and other organic material.
In the case of carbon dating, it is not the initial quantity that matters but the initial ratio of carbon-14 to carbon-12; otherwise the same principle applies.
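The ratio-based calculation can be sketched as follows. The function name, the normalised ratio values, and the 5,730-year half-life figure are illustrative assumptions, not taken from the text above:

```python
import math

C14_HALF_LIFE_YEARS = 5730.0  # commonly used value for carbon-14

def radiocarbon_age(sample_ratio, initial_ratio=1.0):
    """Age implied by a measured C-14/C-12 ratio, expressed relative to
    the assumed initial (atmospheric) ratio -- it is this ratio, not an
    absolute quantity, that the method depends on."""
    decay_constant = math.log(2) / C14_HALF_LIFE_YEARS
    return math.log(initial_ratio / sample_ratio) / decay_constant

# A sample retaining half the initial ratio dates to about one half-life.
print(radiocarbon_age(0.5))  # ≈ 5730 years
```

The choice of `initial_ratio` is itself an assumption about past atmospheric conditions, which is one of the points of contention raised in this article.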
Recognizing this problem, scientists try to focus on rocks that did not originally contain the decay product.
The energy locked in the nucleus is enormous, but cannot be released easily.
The phenomenon we know as heat is simply the jiggling around of atoms and their components, so in principle a high enough temperature could cause the components of the nucleus to break out.
For these reasons, if a rock stratum contains zircon, running a uranium-lead test on a zircon sample will produce a radiometric dating result that is less susceptible to the initial-quantity problem.
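A sketch of how a uranium-lead age could be computed from a zircon's measured lead-to-uranium ratio, assuming no initial lead (the property attributed to zircon above). The function name and the example ratio are hypothetical; the U-238 decay constant is the commonly cited value:

```python
import math

LAMBDA_U238 = 1.55125e-10  # commonly cited U-238 decay constant, per year

def u_pb_age(pb206_per_u238):
    """206Pb/238U age: every daughter lead atom is assumed to come from
    in-situ uranium decay, since zircon rejects lead as it crystallises."""
    return math.log(1.0 + pb206_per_u238) / LAMBDA_U238

# A Pb/U ratio of 1.0 corresponds to one U-238 half-life (~4.47 Gyr).
print(u_pb_age(1.0) / 1e9)  # ≈ 4.47 (billion years)
```

Because the daughter is counted relative to the surviving parent in the same crystal, the initial quantity of parent material drops out of the calculation; the no-initial-lead assumption remains.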
Although radiometric dating methods are widely quoted by scientists, they are inappropriate for aging the entire universe due to likely variations in decay rates.

As a uranium atom trapped in a zircon crystal decays, it disrupts the crystal and allows the resulting lead atom to move. Likewise, heating can transform rock: granite forms gneiss, and basalt forms schist.

Human judgment also enters the record. A frequently quoted passage from the archaeological literature on the handling of inconvenient dates reads, in part: "...If it does not entirely contradict them, we put it in a footnote. And if it is completely 'out of date', we just drop it." Few archaeologists who have concerned themselves with absolute chronology are innocent of having sometimes applied this method...