Anthropology Magazine

The Scientific Sorcery of Radiocarbon Dating

Column / Curiosities



An archaeologist explains why figuring out an object’s age is harder than you think.

Several years ago, I went back to Chicago to see some old friends: artifacts, really—ancient sandals to be precise.

The sandals were at the Field Museum of Natural History in Chicago, Illinois, where I worked in the early 2000s. They had been found in Tularosa Cave, New Mexico, in 1950, and their construction from reed matting made clear they were from early Indigenous people of that region, a group that archaeologists call the Mogollon (pronounced MOH’-gee-yoan). But I wanted to know exactly how old they were, so I was returning to collect samples for radiocarbon dating.

Back in the museum’s Room 38, a cavernous space that once housed my office, I took each of a dozen sandal fragments out of its protective, archive-quality packaging. I carefully turned them over in my gloved hands, searching for a discreet place from which to cut a small, 15-milligram piece (smaller than your pinky fingernail). I packed each sample into a small packet, filled out some paperwork, and sent them off to a laboratory for processing. Several days later—voilà! I got the results. The oldest sandal dated to 1,710 ± 40 radiocarbon years before present.

Radiocarbon dating is an astonishing thing. On the surface, it seems so simple: submit samples and get dates! But there is a long and fascinating history behind this seemingly magical technology, and the answers it spits out are not always as clear-cut as they seem.

The origins of radiocarbon dating trace back to chemists, including Harold Urey, and the Manhattan Project. U.S. Department of Energy/Flickr

The technique holds a special place in my heart. Maybe it’s because, as an archaeologist interested in determining the age of things, radiocarbon dating is astonishingly useful. Perhaps it’s because I grew up on the University of Chicago campus, which played a vital role in the origins of radiocarbon research. Or maybe it’s because the history and application of radiocarbon dating has everything that fascinates me about the history of science: award-winning scientists, unexpected glitches, and startling revelations.

The story begins on December 8, 1941, a day after the Japanese bombed Pearl Harbor. In response to that horrific event, physical chemist Willard Libby of the University of California, Berkeley, decided to join the war effort by offering his services to physical chemist Harold Urey of Columbia University, whose lab was studying different radioactive atoms, including uranium. (Urey won the Nobel Prize in chemistry in 1934 for his discovery of deuterium, a heavy version of hydrogen.) Urey would soon be heading part of the research for the Manhattan Project, the massive effort to develop an atomic bomb before the Germans or Japanese did. Urey accepted Libby’s offer, and Libby moved to New York.

In the meantime, something remarkable happened on the South Side of Chicago, just a few short miles south of the Field Museum. On December 2, 1942, in a squash court (you read right!) underneath the athletic stands at the University of Chicago’s Stagg Field, physicist Enrico Fermi oversaw the first controlled and sustained nuclear chain reaction in the blandly named Chicago Pile-1, a huge block of material consisting of about 770,000 pounds of graphite, more than 80,000 pounds of uranium oxide, and more than 12,000 pounds of uranium. In plain English, Chicago Pile-1 was a nuclear reactor. That experiment paved the way for America’s entry into the nuclear age.

At the end of the war, Libby left Columbia to join Fermi’s lab in Chicago. There he really began to focus on carbon and its remarkable properties. In 1946, Libby published a short letter to the editor of Physical Review in which he pointed out that, based on his research and that of others, there should be a difference between the amount of radioactive carbon in dead and fossilized organisms when compared to modern, living organisms. That insight marked the birth of radiocarbon dating.

Every moment of every day, cosmic rays bombard Earth’s upper atmosphere. Those cosmic rays interact with nitrogen atoms to produce atoms of radioactive carbon-14 (also written as 14C). There’s just a tiny amount (about one in a trillion) of this kind of carbon atom when compared to the amount of nonradioactive 13C and 12C atoms on the planet. But this tiny scattering of 14C gets everywhere: It mixes into the air, which is taken up by plants, which are eaten by animals, and so on up the food chain.

Everything with carbon in it—from your lunch to yourself—has a little radioactive carbon in it. That radioactive carbon is in equilibrium with the environment around it. As Libby realized, when those things die, the process of absorbing any new radioactive 14C stops, and however much radioactive 14C was there starts to decay. The longer a thing has been dead, the less 14C it has left in it.

The half-life of 14C is 5,730 ± 40 years. That means that after 5,730 years, half of the 14C in any given object will have decayed. After another 5,730 years, half of that remaining 14C will have decayed too, so there is a quarter of the original amount, and so on. To derive a radiocarbon age for an archaeological sample, a scientist simply measures the amount of radioactive 14C remaining and calculates how many years it has been since that organism died.
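The arithmetic behind that last sentence is a simple logarithm. Here is a minimal sketch in Python, using the 5,730-year half-life quoted above (a real laboratory would apply conventional corrections and the calibration curves discussed later, so treat this as the textbook version only):

```python
import math

HALF_LIFE = 5730  # carbon-14 half-life in years, per the figure quoted above

def radiocarbon_age(fraction_remaining):
    """Years since death, given the fraction of the original 14C still present."""
    return -HALF_LIFE * math.log2(fraction_remaining)

print(round(radiocarbon_age(0.5)))   # half the 14C left: one half-life, 5730 years
print(round(radiocarbon_age(0.25)))  # a quarter left: two half-lives, 11460 years
```

With half the 14C remaining, the formula returns exactly one half-life; with a quarter remaining, exactly two, matching the halving described above.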

That’s absolutely amazing. But it isn’t actually quite so simple.

Physicists, including Libby, initially assumed that 14C was spread uniformly around the planet. But it’s not. That can have a dramatic effect on radiocarbon dating. Take modern shellfish from coastal Peru, for example: Their radiocarbon date suggests that they’re about 400 years old. Are they related to Rip Van Winkle? No! As remarkable as it may be, they seem that old because they are ingesting seawater that has been on a 400-year trip from its time at the surface in the North Atlantic, where it soaked up its original content of 14C. These mollusks look as old as the water they eat (or, more accurately, filter).


Burning fossil fuels injects old carbon, depleted of carbon-14, into the atmosphere. Gerry Machen/Flickr

Carbon-14 production also isn’t constant over time. Scientists now know that 14C production depends on sunspot activity, changes in the intensity of Earth’s magnetic field, and other geophysical phenomena. Ever since the Industrial Revolution, people have been flooding the atmosphere with old, 14C-deficient carbon by burning fossil fuels. Then there was the nuclear bomb. From 1950 to 1964, the United States, the Soviet Union, and other countries conducted above-ground nuclear tests that injected an abundance of radioactivity into the upper atmosphere. These are some of the reasons that radiocarbon dating isn’t used on objects less than about 300 years old.

Scientists have had to continually tweak and twiddle the conversion equations that tell archaeologists like me how old an object is based on its radiocarbon date. There are rules and calibrations unique to marine life from one ocean to another, from one hemisphere to another, and from different material types—from bones to plants to freshwater clamshells. Researchers keep learning weird and wonderful ways that radiocarbon dates can be “wrong” and how to correct for those errors. Indeed, scientists continuously update the calibration curves that are used to convert radiocarbon “years” to calendar years because the two are not the same.

Remember that sandal from Tularosa Cave that was dated to 1,710 ± 40 radiocarbon years old? Because of all these complicated factors, I had to convert that radiocarbon “date” to calendar years. Scientists have developed a calibration curve based on the radiocarbon dating of individual, annual growth rings in ancient bristlecone pine trees from California and ancient oaks from Europe. That calibration curve tells me that the sandal samples actually date to between 40 B.C. (2,060 years ago) and A.D. 120 (1,900 years ago) on the Christian calendar. In other words, that sandal was made by a Native American from plants that lived when ancient Rome dominated the Mediterranean and the Han dynasty dominated China.
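Once a calibrated calendar range is in hand, converting it to “years ago” is plain subtraction. A minimal sketch in Python, assuming a reference year of 2020 so that the results match the figures above (B.C. years are written as negatives, and this rough arithmetic ignores the calendar’s missing year zero, which doesn’t matter at this precision):

```python
def years_before(calendar_year, reference_year=2020):
    """Convert a calendar year (B.C. as a negative number) to 'years ago'.

    reference_year=2020 is an assumption chosen to match the article's
    figures; the lack of a year zero is ignored at this precision.
    """
    return reference_year - calendar_year

print(years_before(-40))  # 40 B.C. is about 2060 years ago
print(years_before(120))  # A.D. 120 is about 1900 years ago
```

Running this on the calibrated endpoints reproduces the sandal’s range of roughly 2,060 to 1,900 years ago.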

Although imperfect, radiocarbon dating has revolutionized archaeology. (Stay tuned for my next column for examples of that.) To me, it is a complete marvel that archaeologists can use a technique from nuclear physics to date ancient materials up to 70,000 years old—including my beloved sandals from New Mexico.