Why radiocarbon dates can be wrong, and how a Japanese lake is helping to fix them
The method was developed by Willard Libby in the late 1940s and soon became a standard tool for archaeologists. Libby received the 1960 Nobel Prize in Chemistry for this work.
The researchers collected roughly 70-metre core samples from the lake and painstakingly counted the layers to build a direct record stretching back 52,000 years.

Various geologic, atmospheric and solar processes can influence atmospheric carbon-14 levels. Since the 1960s, scientists have accounted for these variations by calibrating the clock against the known ages of tree rings, and research has been ongoing to determine the proportion of carbon-14 in the atmosphere over the past fifty thousand years. The resulting data, in the form of a calibration curve, is now used to convert a given measurement of radiocarbon in a sample into an estimate of the sample's calendar age.

“If you’re trying to look at archaeological sites at the order of 30,000 or 40,000 years ago, the ages may shift by only a few hundred years but that may be significant in putting them before or after changes in climate,” says one of the researchers.
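The two-step logic described above, first computing a "raw" radiocarbon age from the fraction of carbon-14 remaining, then mapping it onto a calendar age via a calibration curve, can be sketched in a few lines of Python. The sample value and the tiny calibration table below are illustrative assumptions, not data from the study; a real curve such as IntCal contains thousands of points with uncertainties.

```python
import math

# By convention, raw radiocarbon ages use the Libby half-life of 5,568
# years, which corresponds to a mean life of about 8,033 years.
LIBBY_MEAN_LIFE = 8033.0  # years

def radiocarbon_age(f14c: float) -> float:
    """Conventional radiocarbon age (years BP) from the measured
    fraction of modern carbon-14 remaining in the sample."""
    return -LIBBY_MEAN_LIFE * math.log(f14c)

# Toy calibration curve: (radiocarbon age BP, calendar age BP) pairs.
# Purely illustrative; shaped so radiocarbon ages read "too young",
# as real calibration curves typically show at these ages.
CURVE = [(0.0, 0.0), (10_000.0, 11_500.0),
         (30_000.0, 34_000.0), (45_000.0, 48_000.0)]

def calibrate(rc_age: float) -> float:
    """Linearly interpolate a calendar age from the toy curve."""
    for (x0, y0), (x1, y1) in zip(CURVE, CURVE[1:]):
        if x0 <= rc_age <= x1:
            return y0 + (rc_age - x0) * (y1 - y0) / (x1 - x0)
    raise ValueError("radiocarbon age outside calibration range")

rc = radiocarbon_age(0.024)   # sample retaining ~2.4% of modern carbon-14
cal = calibrate(rc)           # calendar-age estimate, years BP
```

With this toy curve, a sample that dates to roughly 30,000 radiocarbon years calibrates to a calendar age several thousand years older, which is exactly the kind of shift the researchers describe as significant for placing sites before or after climate changes.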
Take the extinction of Neanderthals, which occurred in western Europe less than 30,000 years ago.
Other corrections must be made to account for variations in the proportion of carbon-14 in different parts of the biosphere (reservoir effects).
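In practice, a reservoir correction amounts to subtracting a locally determined offset from the measured radiocarbon age before calibration, because organisms that drew their carbon from a reservoir such as the surface ocean appear older than they are. A minimal sketch, where the 400-year marine offset is a commonly quoted ballpark figure rather than a value from the study:

```python
# Illustrative surface-ocean reservoir offset, in years (assumed value).
MARINE_RESERVOIR_OFFSET = 400.0

def reservoir_corrected(rc_age: float,
                        offset: float = MARINE_RESERVOIR_OFFSET) -> float:
    """Radiocarbon age with the local reservoir offset removed,
    ready to be passed to a calibration curve."""
    return rc_age - offset

shell_age = reservoir_corrected(5000.0)  # a marine shell dated at 5,000 BP
```

The corrected age, not the raw measurement, is what gets compared against the calibration curve.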
Additional complications come from the burning of fossil fuels such as coal and oil, and from the above-ground nuclear tests done in the 1950s and 1960s.
Histories of archaeology often refer to its impact as the "radiocarbon revolution".
Radiocarbon dating has allowed key transitions in prehistory to be dated, such as the end of the last ice age, and the beginning of the Neolithic and Bronze Age in different regions.
Climate records from a Japanese lake are set to improve the accuracy of the dating technique, which could help to shed light on archaeological mysteries such as why Neanderthals became extinct.