[Originally published as Archaeological Investigations, pt 1]
Aside from geological arguments, there are those who look to ancient cultures and their records for ammunition against the Bible and its historical accounts, claiming that the recorded histories of several civilizations immediately dispel its authenticity and authority because the timelines are incompatible.
For instance, most adherents of Biblical young-earth creationism hold that the planet and all that exists on it are less than roughly ten thousand years old, with a great many such believers going further and holding that creation is less than seven thousand years old.
Such a recent creation stands in stark opposition to the claims of certain notable cultures: the Egyptians and the Babylonians, for example, placed the antiquity of their civilizations at roughly 11,000 and 730,000 years respectively. In each case, the culture left an apparent wealth of evidence, including various chronicles and genealogies, to support its claim.
As such, what is one to make of these accounts? Do the histories of these and other cultures immediately discredit a literal reading of Genesis, thereby dismantling the trustworthiness of the Bible as a whole? Could it instead be that something else is going on?
Let us take a moment to consider a few of the ways in which the antiquity of a culture is established.
Some of the more common methods used by archaeologists and other researchers include radioisotope dating, thermoluminescence dating, optically stimulated luminescence dating, paleomagnetic and archaeomagnetic dating, and dendrochronology.
Several radioisotope dating methods, utilizing a range of elements, are put to archaeological use in an effort to ascertain the age of ancient artifacts. The most widely known, of course, is carbon-14 dating, in which carbon-14 decays to nitrogen-14. As we covered earlier, the existence of carbon-14 (with a half-life of only 5,730 years) in diamonds and coal, both of which are assumed to require millions or even billions of years to form, discredits the trustworthiness of this technique: at that half-life, a sample a million years old should retain no measurable carbon-14 at all.
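To make the half-life arithmetic concrete, here is a minimal sketch of the standard decay calculation that radiocarbon dating rests on. The 5,730-year half-life comes from the text above; the function names, sample values, and the detection remark are illustrative assumptions, not part of the original article.

```python
import math

C14_HALF_LIFE = 5_730.0  # years, the half-life cited above

def remaining_fraction(years: float, half_life: float = C14_HALF_LIFE) -> float:
    """Fraction of the original carbon-14 left after the given time: 0.5^(t / T_half)."""
    return 0.5 ** (years / half_life)

def conventional_age(fraction: float, half_life: float = C14_HALF_LIFE) -> float:
    """Age implied by a measured carbon-14 fraction: t = -T_half * log2(fraction)."""
    return -half_life * math.log2(fraction)

print(remaining_fraction(5_730))      # 0.5 -- one half-life gone
print(remaining_fraction(1_000_000))  # ~3e-53 -- effectively nothing after a supposed million years
print(conventional_age(0.25))         # 11460.0 -- two half-lives
```

Under these assumptions, the arithmetic underlines the point above: after a supposed million years (about 175 half-lives), the remaining fraction would sit dozens of orders of magnitude below any instrument's detection threshold.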
The method also relies on organic carbon, so it can only be used to date materials that were once alive, ruling out testing the rock layers themselves. Potassium-40 to argon-40 dating has likewise been demonstrated to be faulty; what's more, it applies only to igneous rock, precluding any accurate use on sedimentary strata.
Other radiometric dating methods sometimes used include uranium-238 to lead-206 dating, rubidium-87 to strontium-87 dating, and samarium-147 to neodymium-143 dating. Each of these methods, however, is limited to igneous or metamorphic rock and cannot accurately be used on sedimentary strata: a sedimentary layer is an amalgam of grains eroded from older rocks, so any date obtained would reflect the source material rather than the time of deposition.
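All of these parent-to-daughter methods rest on the same closed-system age equation, t = (1/λ) × ln(1 + D/P), where D/P is the measured daughter-to-parent ratio. The sketch below shows that shared arithmetic; the half-life figures are commonly cited values, and the measured ratio is purely hypothetical.

```python
import math

# Commonly cited half-lives (years) for the parent isotopes named above;
# exact published values vary slightly between sources.
HALF_LIVES = {
    "U-238 -> Pb-206": 4.468e9,
    "Rb-87 -> Sr-87": 4.9e10,
    "Sm-147 -> Nd-143": 1.06e11,
}

def decay_age(daughter_to_parent: float, half_life: float) -> float:
    """Age from t = (1/lambda) * ln(1 + D/P), which presumes a closed system,
    no daughter isotope at the start, and a decay rate constant for all time."""
    lam = math.log(2) / half_life
    return math.log(1.0 + daughter_to_parent) / lam

# A hypothetical measured Pb-206/U-238 ratio of 0.1 implies ~614 million years --
# but only if every one of the presumptions in the docstring actually holds.
print(decay_age(0.1, HALF_LIVES["U-238 -> Pb-206"]))
```

The list of assumptions below is, in effect, the list of conditions under which this single equation is valid.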
Furthermore, various incidents have repeatedly cast doubt on each of these methods, including strong discordances between the anticipated ages of artifacts and the ages the tests yielded. Aside from the doubt raised by such incidents, there are of course the considerable assumptions necessary to trust the results of any such radiometric dating technique (illustrated in the sketch after this list), including the beliefs that:
- the decaying material has remained a closed system, with exogenous sources of interference and contamination ruled out wholesale
- the amount of parent isotope in the original sample is known
- the amount of daughter isotope in the original sample is known
- the rate of radioactive decay has remained constant over the years, and…
- the atmosphere and other such factors affecting radioactivity have remained constant since the time of deposition.
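Here is the sketch referred to above: a minimal illustration of how just one of these assumptions, the initial amount of daughter isotope, drives the computed age. The measured ratio and the assumed initial values are all hypothetical, and this simple model-age calculation sets aside the more elaborate isochron procedures practitioners employ.

```python
import math

RB87_HALF_LIFE = 4.9e10  # years, a commonly cited value for Rb-87 -> Sr-87

def model_age(measured_dp: float, assumed_initial_dp: float,
              half_life: float = RB87_HALF_LIFE) -> float:
    """Age from t = (1/lambda) * ln(1 + (D - D0)/P): the measured daughter-to-parent
    ratio minus whatever daughter is *assumed* to have been present at the start."""
    lam = math.log(2) / half_life
    return math.log(1.0 + (measured_dp - assumed_initial_dp)) / lam

measured = 0.010  # hypothetical measured Sr-87/Rb-87 ratio
for d0 in (0.000, 0.002, 0.005, 0.008):
    print(f"assumed initial ratio {d0:.3f} -> age {model_age(measured, d0):,.0f} years")
```

Under these made-up numbers, shifting the assumed initial daughter ratio from 0.000 to 0.008 moves the computed age from roughly 703 million years down to roughly 141 million: a swing produced entirely by a quantity no one can go back and measure.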
Any reasonable person should be able to admit that, from our position in time, we have no way of confirming these variables, and thus no way of verifying the radiometric dates built upon them. As such, in archaeological studies as in geological studies, radiometric dating cannot be trusted.
To be continued…