Wednesday, January 11, 2012

Why Do We Use Such Stupid Time Units?

Here in the US of A, we like our units of measurement obsolete and ridiculous.  Unlike basically the rest of the civilized world (no, England and Australia, you don't count as part of the civilized world), we continue to use anachronistic and unwieldy units of measurement like the foot, the mile, the pound, and the fluid ounce in our everyday lives.  Aside from unnecessarily confusing elementary school kids though, this is a pretty harmless indulgence: in science, engineering, and most other areas where precise measurement is important, we pretty much just suck it up and use the metric system.

In very general terms, the metric system uses a single (usually arbitrary, but whatever) base unit for each type of measurement, e.g. the meter for distance, the gram for mass, and the liter for liquid volume.  You measure everything in terms of multiples of the base unit, often with fun prefixes grafted on as shorthand for orders of magnitude (e.g. one gigameter = 1 billion meters).  It makes converting between units easy, and you don't have to remember things like the fact that there are exactly 5280 feet in a mile or that water boils at 212 degrees Fahrenheit.  There's a metric base unit for almost every possible thing you could want to measure, from simple things like distance (meters) to somewhat more complex quantities like electric potential (volts).
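
To make the prefix trick concrete, here's a minimal sketch in Python (the prefix table is a small subset, and the `to_base_units` helper is made up purely for illustration) showing how conversion collapses to multiplying by a power of ten:

```python
# A sketch of metric prefix arithmetic: every prefix is just a power of
# ten, so converting between units is a single multiplication.
# (Table and function name invented for this example.)
PREFIXES = {
    "milli": 1e-3,
    "centi": 1e-2,
    "kilo": 1e3,
    "mega": 1e6,
    "giga": 1e9,
}

def to_base_units(value, prefix):
    """Convert a prefixed quantity to the base unit, e.g. gigameters to meters."""
    return value * PREFIXES[prefix]

# One gigameter really is a billion meters -- no 5280s to memorize:
print(to_base_units(1, "giga"))  # 1000000000.0
```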

The big exception, if you hadn't already guessed it from the post title, is time.  Our basic unit of time is the second, which we pretend has been carefully defined as exactly 9,192,631,770 oscillations of a cesium-133 atom but is really just an arbitrary thing that got settled on millennia before anyone even knew what an atom was.  Like I said though, pretty much all the base metric units are arbitrary, so how we came by it is really not important.  It only gets weird when you start scaling up: 60 seconds is one minute, 60 minutes is one hour, 24 hours is one day, and 365.25 days is a year (I'm not even going to touch months here).  The latter two have solid physical groundings; it takes the earth one day (86400 seconds) to complete a full rotation, and 365.25 days (31557600 seconds) to complete a full orbit around the sun.  So we'll keep those, thank you.
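
Those parenthetical numbers are easy enough to verify, since the whole conversion chain is just multiplication:

```python
# Sanity-checking the figures above: a day and a year, in seconds.
seconds_per_day = 24 * 60 * 60           # 24 hours x 60 minutes x 60 seconds
seconds_per_year = 365.25 * seconds_per_day

print(seconds_per_day)        # 86400
print(int(seconds_per_year))  # 31557600
```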

What I've never understood is the minutes and hours part of the scale: why 60 seconds to a minute, why 60 minutes to an hour, and why 24 hours to a day?  Ordinarily I wouldn't care, but unlike most of the stupider units still floating around today I actually have to deal with minutes and hours quite a bit in my job as a science-person, and it's a total pain in the ass.  So where did these three oddball units come from, and why have they stuck around even in metric-system-dominated fields?

The answer to how the day got divided into 24 chunks is kind of obvious if you think about ancient civilizations and how little of the universe they could actually observe and measure.  The key to everything is the number 12, which is roughly the number of lunar cycles in a year.  The moon was one of the very few extraterrestrial things that ancient civilizations could easily observe, so this was apparently considered a big enough deal that the Sumerians, Indians, and Chinese all (apparently independently?) decided to ignore the number of fingers on their hands and split the time between sunrise and sunset into 12 chunks instead of the slightly more logical 10.  For the sake of symmetry, the Greeks and Romans eventually got around to splitting the night up similarly, giving a total of 24 "hours" per full day.

An early issue with the "between sunrise and sunset" reckoning of hours was the fact that the time between sunrise and sunset varies quite a bit depending on the time of year.  This was generally worked around by just letting the length of the hour change with the seasons.  Since it isn't like pre-technological civilizations had any real need for precise time synchronization, this probably mattered less than my horrified engineer's brain seems to think.  Even pendulum-based clocks could be made to account for the varying length of the hour by just adjusting their swing period every once in a while.  Eventually (around 100 BC) the Hellenistic-period Greeks realized this was dumb and came up with more or less our current system, where hours are always the same length and sunrise/sunset just occur at different times throughout the year.

So that's why there's 24 hours in a day, but why are there 60 minutes per hour (and, since the answer turned out to be the same, 60 seconds per minute)?  Blame the Babylonians, who were very clever astronomers but not so good with things like fractions (to be fair, this was a long time ago and they were pretty much inventing math as they went along).  They decided to go with base-60 math for their astronomical calculations, since you can divide 60 evenly by 2, 3, 4, 5, 6, 10, 12, 15, 20, or 30 without having to figure out what to do with a remainder.  Because this was handy, it stuck, and base 60 has been used for the sub- and sub-sub-division of the hour since about the time the Greeks standardized the hour itself, around 100 BC.
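
If you want to see the Babylonians' point for yourself, here's a quick sketch (just a throwaway `divisors` helper, nothing more) comparing the clean divisors of 60 against those of 10:

```python
# Base 60's big selling point: far more even divisors than base 10.
def divisors(n):
    return [d for d in range(1, n + 1) if n % d == 0]

print(divisors(10))  # [1, 2, 5, 10]
print(divisors(60))  # [1, 2, 3, 4, 5, 6, 10, 12, 15, 20, 30, 60]
```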


That's all great, but it brings us to the real question: why are we still using these odd time units that are based on things that long since became irrelevant?  Like most questions of this sort, the answer is a squishy mix of convenience and convention.  Basically every civilization on this planet has been keeping time with the 24/60/60 scale for over 2000 years, which is a lot longer than any other unit of measurement has survived.  So in spite of its general unwieldiness we're pretty used to it, and it basically does what it needs to do (exhibit A here: the fact that even the godless socialists in Europe never managed to make metric or decimal time stick).  Even the International System of Units (SI) (the metric system, basically) attempts to square the circle on this one; the only official time unit in SI is the second, which you can multiply up or down with prefixes the way you would with any other basic unit of measurement.  However, both the minute and the hour are considered what amounts to honorary SI units; they aren't in the club, but enough people use them anyway that someone thought they should at least be defined in relation to the second and have official unit symbols and stuff.  I'm sure someone somewhere is measuring some time-based process in kiloseconds or megaseconds, but quite honestly even most scientists would think that was pretty weird and just grab a calculator to convert to hours, days, or years.
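
For what it's worth, the conversion our hypothetical kilosecond-measurer would punch into that calculator is trivial to script; here's a rough sketch (the `humanize_seconds` helper is invented for this example):

```python
# Converting SI-prefixed seconds back into the "honorary" units people
# actually think in. Function name made up for illustration.
SECONDS_PER_HOUR = 3600
SECONDS_PER_DAY = 86400
SECONDS_PER_YEAR = 31557600

def humanize_seconds(seconds):
    """Express a duration in hours, days, or years, whichever reads best."""
    if seconds < SECONDS_PER_DAY:
        return f"{seconds / SECONDS_PER_HOUR:.2f} hours"
    if seconds < SECONDS_PER_YEAR:
        return f"{seconds / SECONDS_PER_DAY:.2f} days"
    return f"{seconds / SECONDS_PER_YEAR:.2f} years"

print(humanize_seconds(100_000))    # 100 kiloseconds -> '1.16 days'
print(humanize_seconds(5_000_000))  # 5 megaseconds -> '57.87 days'
```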

6 comments:

  1. There's a Simpsons episode where Mensa takes over the town and Principal Skinner explains, "Not only are the trains now running on time, they're running on metric time. Remember this moment, people: 80 past 2 on April 47th." Unfortunately, I couldn't find it on YouTube.

  2. I was actually thinking about that episode when I wrote this, since it's the only mention I've ever seen of metric time anywhere. I think the fact that MIT has an Esperanto club but nobody trying to use metric time kind of says it all.

  3. I saw this today, which I had never heard about, but it plays into the same sort of mess caused by tying time to the sun.
    http://finance.yahoo.com/news/countries-consider-time-leap-second-103956872.html

  4. Believe it or not, I'm pretty sure the whole complex system used to propagate UTC time across all computer systems was originated at UD. So we missed a golden opportunity to mess with TIME ITSELF all those years ago.

  5. Just as a note, there was decimal time for a very brief period during the French Revolution - 1793 I believe. It didn't really take though.

  6. We could metrify the day, but ten has few factors. Since there are 2 dozen hours in a day, 5 dozen minutes in an hour, and 5 dozen seconds in a minute, we could metrify in base twelve. People don't like change though.
