Radioactive dating no problem for the Bible

What about radioactive dating? Doesn’t that prove the world is millions of years old? It may be one of the big questions looming in your mind.

But the idea of an unimaginably old earth did not come from radioactive dating. It was popular long before radioactivity was discovered (see Western Culture and the Age of the Earth). It came from a geologic philosophy, not a scientific measurement.

Note too that radioactive dating is something that most people don’t understand. Normal people are not familiar with isotopes, mass spectrometers, rubidium, strontium or half-lives. We find ourselves in the position of being asked to trust the specialists without being able to check the facts first-hand.

But it’s not difficult to understand enough of the basic principles to realize that the alleged ages of millions of years have not been measured objectively, but derived from subjective assumptions.

What is radioactive decay?

Radioactive dating begins by carefully measuring the concentrations of radioactive isotopes in rocks. Everything is composed of elements and there are about 90 naturally occurring ones, such as hydrogen, carbon, oxygen and iron. Each element comes in different forms, called isotopes, most of which are stable and do not change. Some isotopes, however, are unstable and decay radioactively into other elements. There are many different radioactive isotopes that are used for radiometric dating.

For example, there is a radioactive form of potassium (potassium-40) that decays into argon (argon-40). The unstable potassium isotope is called the parent while the argon product is called the daughter. There are a couple of different radioactive forms of uranium that decay into lead. There is a radioactive form of thorium that also decays into lead. There is an isotope of samarium that decays into neodymium, and one of rubidium that decays into strontium.

How does radioactive dating work?

Radioactive dating is often illustrated with an hourglass. The sand grains in the top of the sealed glass are like the atoms of the parent isotope in the rock, and those at the bottom are like the atoms of the daughter. As the parent atoms decay into daughter atoms, it is like individual grains of sand falling from the top of the glass to the bottom. The hourglass depends on the sand falling at a regular rate. Like an hourglass, it is said, you simply measure the parent and the daughter elements and you can calculate the age.
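
The calculation being claimed can be sketched in a few lines. The snippet below (Python) is an illustration only: it assumes a simple one-step decay, no daughter atoms present at the start, nothing gained or lost since, and a constant decay rate. The names and numbers are made up for the example, not taken from any real measurement.

    # A rough sketch of the textbook age calculation described above.
    # Built-in assumptions: one-step decay, no daughter present at the start,
    # nothing gained or lost since, and a constant decay rate.
    import math

    def claimed_age(parent, daughter, half_life):
        """Age implied by the measured parent and daughter amounts (same units)."""
        decay_constant = math.log(2) / half_life      # lambda = ln 2 / half-life
        return math.log(1 + daughter / parent) / decay_constant

    # With equal amounts of parent and daughter, the formula returns exactly
    # one half-life, i.e. half of the original parent has decayed.
    print(claimed_age(parent=1.0, daughter=1.0, half_life=1.0))   # -> 1.0

Notice that the ‘empty at the start’ and ‘regular rate’ assumptions of the hourglass are baked directly into the formula; they are exactly the points questioned below.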

What was the starting amount?

However, an hourglass is only useful if we saw it turned over and observed that the bottom glass was empty. In other words, the hourglass only works when we know its initial condition. Unlike with the hourglass, we do not know how much of each isotope was in the rock to begin with, because we did not observe what happened in the past when the rock formed. Neither can we travel into the past to make the necessary measurements. All we can do is guess. This is the fatal problem that essentially makes radioactive dating useless as a primary method for determining age.
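
To see how much hangs on that guess, here is a small illustrative calculation using the same generic formula as above. The measured amounts are identical in every case; only the assumed starting amount of daughter changes, and the calculated ‘age’ changes with it. All numbers are invented.

    # Illustrative only: identical measurements give different 'ages' depending
    # on how much daughter is assumed to have been present at the start.
    import math

    def age_given_assumption(parent, daughter, assumed_initial_daughter, half_life):
        decay_constant = math.log(2) / half_life
        radiogenic = daughter - assumed_initial_daughter  # daughter attributed to decay
        return math.log(1 + radiogenic / parent) / decay_constant

    measured_parent, measured_daughter = 1.0, 1.0
    for assumed in (0.0, 0.5, 0.9):
        age = age_given_assumption(measured_parent, measured_daughter, assumed, 1.0)
        print(assumed, round(age, 3))    # 1.0, 0.585 and 0.138 half-lives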

Geologists don’t like to assume the amount of daughter directly (perhaps that sounds too much like cheating), but they often do, and they call the result a ‘model’ age. They prefer to make indirect assumptions. They may assume that the different minerals in a rock originally had the same isotopic ratios, as in the sketch below. Or they may assume that different rock samples from the same geographical area had the same ratio. Each dating method uses a different kind of assumption to get around the deadly problem that we cannot make measurements in the past.
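
The ‘same initial ratio’ assumption is the basis of the widely used isochron approach: ratios measured on several minerals are plotted, a straight line is fitted through them, the slope is converted into an age, and the intercept is read back as the assumed common starting ratio. The sketch below runs on invented, perfectly behaved data and is only meant to show the arithmetic, not any real rock.

    # Rough sketch of an isochron-style calculation on invented data.
    # Assumptions built in: every mineral started with the same daughter ratio
    # and the rock has been a closed system ever since.
    import math

    # Invented (parent/reference, daughter/reference) ratios for four minerals,
    # chosen to lie exactly on the line y = 0.70 + 0.01x.
    points = [(0.5, 0.705), (1.0, 0.710), (2.0, 0.720), (4.0, 0.740)]

    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in points)
             / sum((x - mean_x) ** 2 for x, _ in points))
    intercept = mean_y - slope * mean_x

    half_life = 1.0                               # arbitrary units, illustrative
    decay_constant = math.log(2) / half_life
    age = math.log(1 + slope) / decay_constant    # slope = e**(lambda*t) - 1

    print(round(slope, 3), round(intercept, 3), round(age, 3))
    # slope 0.01, intercept 0.7 (taken as the initial ratio), age ~0.014 half-lives

Note that the calculation still rests on the two assumptions stated in the comments; it does not remove them.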

Has the rock been disturbed?

Apart from the fatal problem of not knowing the initial conditions, there is another problem that is just as deadly. We don’t know what happened to the rock during its ‘lifetime’.

An hourglass is only useful if it has not been disturbed. But after rocks crystallize from molten magma, they can be heated and cooled; they can be affected by metamorphic events and groundwater. These geologic events can cause elements to be gained by, or lost from, the rock. It’s like cracking the hourglass and having some of the sand leak out, or other sand leak in. How can we know what disturbances have affected the elements in our rocks? Again, we can only guess.

Every date has to be interpreted

Did you hear about the old wood cutter who was bragging about his axe? ‘I’ve had this trusty axe for fifty years,’ he said. ‘It’s only had two new heads and three new handles.’ The question is: how old was his axe?

It’s much the same with rocks. When a geologist hammers off a sample of rock he needs to know its history. Different minerals would have crystallized at different times depending on the way the molten magma cooled. Some small pieces of other rock, or even some foreign minerals, may have been carried along by the magma and existed long before the rock crystallized. Other minerals may have grown inside the rock much later, during a time when the area was heated and metamorphosed. Some minerals may have crystallized even later still when ground waters in the area percolated through the pores of the rock. So the age of a rock is quite a complicated question, and we first need to know its entire history before we can develop a story to explain the isotopic measurements.

This means that, on its own, a radioactive ‘date’ is meaningless. Geologists recognize this. You may be surprised to learn that a geologist would never collect a rock at random and send it off for radioactive dating on its own. The result would mean nothing. Every radioactive date has to be interpreted before anyone can say what it means.

What happens is that the geologist will carefully record exactly where he collected the rock. He explores the geology of the area so he can understand the geological history, and where his particular sample fits into the sequence of geological events. He checks out the ages other geologists have assigned to the different rocks in the region. He studies samples of his rock under the microscope looking for clues of how it crystallized, whether it was later heated, deformed, altered or weathered.

Then, when the laboratory sends him the ‘date’ for his rock, he can decide what the date refers to. Does it represent the time the rock crystallized or when it cooled? Or perhaps the date refers to the time when the rock was heated or deformed or altered, or somewhere between two of these. Or maybe the date refers to an earlier time, a time when the magma melted before the rock even formed. So the geologist has a lot of options he can choose from as he develops a story to explain the meaning of the date for his rock. He can even combine a number of different explanations to explain his result.

And even after the geologist has interpreted his date and published his interpretation in a journal, another geologist may later decide that there is a problem with that interpretation, and say the date should be disregarded or reinterpreted.

So radiometric dating never has the final word. It is not as objective as the lay person is led to believe.

Has the decay rate ever changed?

An hourglass is only useful for telling time if the sand always falls at the same rate. An hourglass can be disturbed if it tips over, is shaken, or gets moisture inside.

Likewise, radioactive dating will only be reliable if the decay rate of the isotopes has never been disturbed. Each kind of radioactive isotope decays at a regular, repeatable rate, expressed as its ‘half-life’, the time it takes for half of the parent atoms to decay. It is generally believed that decay rates never change, even under the sorts of conditions that could be experienced deep inside the earth, or inside other planets.
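
For reference, the decay law behind the half-life idea takes only a couple of lines; a constant rate is simply assumed, which is the very point examined next.

    # The standard exponential decay law behind the half-life concept.
    # A constant decay rate is assumed throughout.
    def fraction_remaining(elapsed_time, half_life):
        return 0.5 ** (elapsed_time / half_life)

    for n in range(5):                           # after 0 to 4 half-lives
        print(n, fraction_remaining(n, 1.0))     # 1.0, 0.5, 0.25, 0.125, 0.0625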

However, there is one survey of the scientific literature that refers to more than two dozen experiments where changes in decay rates were reported.1 Laboratory experiments have quantified, for certain radioactive decay processes, how much the rate is affected by the chemical and physical conditions, but in these cases the changes observed are small.2 On the other hand, it has been demonstrated in the laboratory that under certain conditions the radioactive decay rate can be accelerated a billion-fold.3

Some may argue that these sorts of conditions would not apply on the earth, or that the changes are only small in most cases. But in recent years, a group of seven creationist research scientists, called the RATE group,4 has identified examples in the field that point toward accelerated nuclear decay.5

They have also developed a theoretical basis for how accelerated decay could occur.6

The fact is that we cannot travel into the past so we cannot know all the different conditions that have existed on Earth and to which rocks may have been subject. So, the idea that decay rates have remained absolutely constant over all time is a belief, not a fact. And even secular scientists have sometimes proposed that the decay rate changed in the past in order to resolve a disagreement between the age of the earth and the age of the universe.

Not objective measurement, but subjective assumption

We are all familiar with measuring time so we should easily see that radioactive dating is not everything it’s claimed to be. In an Olympic race, for example, the official starts his stopwatch when the starting gun sounds. He stops his watch when the athlete crosses the finish line. He reads the time from his watch. But what would happen if he missed the beginning of the race and only saw the finish? It would be impossible for him to measure the time, no matter how accurate his watch. We all know that, so we should all see the inherent problems with radioactive dating.

Every ‘scientific’ dating method, including radioactive dating, needs to know the initial conditions of the rock. But, unlike the Olympic official, we were not present at the beginning so we can only assume how the rock formed and what the conditions were. Not only that, but we must also assume what happened to the rock during its lifetime.

Clearly, radioactive ‘dates’ are not independently determined, objective measurements of age. Rather, all dates are based on subjective assumptions. And because long-age researchers don’t take the Bible’s history seriously they make assumptions that are inconsistent with it. That’s why their answers contradict the Bible. But the numbers they quote are all based on assumptions and don’t disprove the biblical timescale at all.

Published: 30 April 2008

References

  1. Hahn, H.-P., Born, H.-J. and Kim, J.I., Survey on the rate perturbation of nuclear decay, Radiochimica Acta 23:23–37, 1976.
  2. Huh, C.-A., Dependence of the decay rate of 7Be on chemical forms, Earth and Planetary Science Letters 171:325–328, 1999.
  3. Woodmorappe, J., Billion-fold acceleration of radioactivity demonstrated in laboratory, Journal of Creation 15(2):4–6, 2001.
  4. RATE stands for Radioisotopes and the Age of The Earth.
  5. Snelling, A.A., Radioisotope dating of rocks in the Grand Canyon, Creation 27(3):44–49, 2005.
  6. Chaffin, E.F., Accelerated decay: theoretical considerations; in: Vardiman, L. et al. (Eds.), Radioisotopes and the Age of the Earth Vol. II, ICR, El Cajon, CA, and CRS, Chino Valley, AZ, pp. 525–586, 2005.
