Tuesday, May 1, 2012

How Does Radiocarbon Dating Work?

As with most things in this world, I never gave radiocarbon dating (the method archaeologists use to determine the approximate age of stuff they dig out of the ground) a whole lot of thought.  It seemed obvious enough how it worked: some small percentage of the carbon in everything is apparently the unstable carbon-14 isotope.  That isotope has a finite, known half-life; over time it decays away (into nitrogen-14, as it turns out, rather than into the stable carbon-12 and carbon-13 isotopes like I'd vaguely assumed).  The result of all this is that you should be able to estimate the age of things by the amount of carbon-14 left in them.  Assuming you can avoid sample contamination by "modern" carbon-bearing materials (which I would guess is really hard), radiocarbon dating is apparently pretty accurate for measuring the ages of things.  Or that's what I assumed; like I said, since it doesn't really have anything to do with my field, I didn't give it a whole lot of thought.

After writing that post a few weeks ago on why the earth's crust is mostly silicon, though, I realized something: basically all the carbon in the universe was formed in supernovas during the first couple of billion years after the Big Bang.  That means that the carbon in atmospheric CO2, animal remains, and anything else you might find on Earth should be 1) pretty much all the same age, give or take a billion years, and 2) significantly older than the Earth itself.  How can you use atoms from the primordial universe to determine the relative ages of things from the (fairly short, on the cosmic scale) history of planet Earth?

Per usual with this kind of thing, I started out with a couple of entirely wrong assumptions.  The first was that all the carbon on earth, in all its various isotopes, was Big Bang detritus.  In fact, it turns out that carbon-14 (C-14 from here on out, because typing is hard) is continuously produced in the atmosphere via interactions between atmospheric nitrogen and the high-energy cosmic rays that are constantly bombarding the earth.  The cosmic rays produce neutrons in the upper atmosphere, and one of those neutrons can knock a proton out of the nucleus of a nitrogen atom floating around up there and take its place.  The result is an atom with the same mass number as nitrogen (because protons and neutrons weigh about the same) but an atomic number (proton count) that's one lower, which if you check the periodic table corresponds to our unstable C-14 atom.  Said C-14 atom will eventually combine with oxygen to form atmospheric CO2, which is then taken up by plants, etc etc, until basically every living thing has some small quantity of the stuff in it.
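For what it's worth, that swap is simple enough to write out in the same shorthand this post already uses (a neutron goes in, a proton gets kicked out, and the nitrogen becomes carbon):

    n + N-14 → C-14 + p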

The half-life of C-14 is surprisingly short, about 5,730 years.  That means that the stuff would be long gone from the Earth if the supply of it wasn't being constantly re-upped by cosmic ray bombardment, which solves the "all carbon is as old as the universe" problem.  It does make things tricky though, because now you've got two questions to answer when you want to carbon-date something:

1) How much of the C-14 in the thing you're dating has decayed?

2) How much C-14 was there to begin with, and how old was it?

You need at least a reasonable approximation of both of those numbers to accurately determine the age of something. 
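To make question 1 concrete: radioactive decay follows the usual exponential law, so if you know the half-life you know exactly what fraction of an original stash of C-14 survives after any amount of time.  Here's a minimal sketch of that in Python (the 5,730-year figure is the commonly quoted C-14 half-life; the function name is just something I made up for illustration):

    # Fraction of an original batch of C-14 atoms still un-decayed after some number of years.
    HALF_LIFE_YEARS = 5730.0

    def fraction_remaining(years):
        return 0.5 ** (years / HALF_LIFE_YEARS)

    print(fraction_remaining(5730))   # 0.5    -- one half-life
    print(fraction_remaining(11460))  # 0.25   -- two half-lives
    print(fraction_remaining(57300))  # ~0.001 -- ten half-lives (more on that below)

That takes care of the "how much has decayed" side of things; the "how much was there to begin with" side is what the next bit is about.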

The critical concept here is that a living organism is constantly refreshing its internal supply of C-14.  Whether it's a plant taking up CO2 during photosynthesis, an animal eating that plant, or a bigger animal eating that animal, atmospheric CO2 is constantly making its way into the internals of every living thing on Earth.  As a result, the ratio of C-14 to C-12 in any living thing is going to be approximately identical to the ratio in the atmosphere, which we can treat as roughly constant over time (it actually drifts a bit, which is why real radiocarbon dates get corrected against calibration curves, but it's close enough for our purposes).  When an organism dies, it stops doing all the things that would normally refresh its supply of C-14 (eating and breathing, for example), making death a handy "t=0" point for carbon dating.

Basically, if you know the half-life of C-14 (which we do), you can approximate the time of death of any once-living thing (or anything that was originally part of a living thing, like fabrics) by comparing the ratio of C-14 to C-12 atoms in a sample of it to the atmospheric ratio.  And since the decay law only cares about the fraction of the original C-14 that's left, not when each individual atom was created, the age of the C-14 already in the body doesn't matter.  It's worth mentioning that carbon dating only works on discrete ex-lifeforms; trying to carbon-date something like soil or peat will just give you a mess, since there are bits of things in there that all died at totally different times.
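If you want to see the actual arithmetic, here's a minimal sketch in Python of turning a measured C-14/C-12 ratio into an age.  It leans on the same simplifying assumption as above (that the atmospheric ratio is a fixed constant; the real method calibrates for drift), and the function name and example numbers are made up for illustration:

    import math

    HALF_LIFE_YEARS = 5730.0  # commonly quoted C-14 half-life

    def age_from_ratio(sample_ratio, atmospheric_ratio):
        # The fraction of C-14 left since death is (sample ratio) / (atmospheric ratio);
        # invert fraction = 0.5 ** (t / half_life) to solve for t, the years since death.
        fraction_left = sample_ratio / atmospheric_ratio
        return HALF_LIFE_YEARS * math.log2(1.0 / fraction_left)

    # A sample holding half the atmospheric ratio died about one half-life ago:
    print(age_from_ratio(0.5, 1.0))   # ~5730 years
    # A quarter of the atmospheric ratio puts it two half-lives back:
    print(age_from_ratio(0.25, 1.0))  # ~11460 years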

The most interesting consequence of C-14's much-shorter-than-I-thought half-life is that it sets a pretty firm limit on the maximum carbon-dateable age of stuff.  The older something is, the more of its C-14 has already decayed, and the harder it is to accurately measure what's left.  In practice, you can carbon-date things back to about 10 C-14 half-lives, or somewhere shy of 60,000 years.  At that point, the C-14 ratio has decayed to less than 1/1000th of its initial value; as you approach that limit, measurements get much more difficult and much more susceptible to contamination.  So radiocarbon dating is really only useful for dating once-living things from roughly the blip of time that humans have been around.  Whether because of pop-cultural portrayals or just the fact that I'm naturally incurious, I'd always just assumed carbon dating would give you the age of anything you wanted, regardless of composition or oldness.
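For the record, the arithmetic behind that cutoff, in the same terms as the sketches above:

    >>> 10 * 5730     # ten half-lives, in years
    57300
    >>> 0.5 ** 10     # fraction of the original C-14 left after ten half-lives
    0.0009765625

which is why the limit usually gets rounded to "about 60,000 years," and why that last sliver of signal is so easy to swamp with contamination.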

(Wikipedia, as usual)