They are indeed strange numbers for such common, everyday goings-on as the freezing and boiling of water.
We’re stuck with them because a German glassblower and amateur physicist named Gabriel Fahrenheit (1686–1736) made a couple of bad decisions. Gadgets for measuring temperature had existed since about 1592, even though nobody knew what temperature was, and nobody had tried to attach numbers to it.
Then in 1714 Fahrenheit constructed a glass tube containing a very thin thread of mercury—a nice, shiny, easily visible liquid—that went up and down by expansion and contraction as it got hotter and colder. But Fahrenheit’s thermometer, like all that preceded it, was like a clock without a face. He had to put numbers on the thing, or else how could anyone complain about the weather?
So Fahrenheit had to devise a set of numbers to inscribe on his glass tubes, such that the mercury would rise to the same number on all thermometers when they were at the same temperature. And that’s when Gabriel blew it.
Historians still speculate about what must have been going through his mind, but the following might be a good guess. First, he decided that because a full circle has 360 steps or degrees, it would be nice if there were also 360 steps—and why not call them degrees—between the temperatures of freezing water and boiling water. But 360 steps would make each degree too small, so he chose 180 instead.
That fixed the size of the degree: exactly 1/180th of the distance on the tube between the freezing and boiling marks. But what, he wondered, should the actual numbers be? Zero and 180? 180 and 360? Or, heaven forbid, 32 and 212? (212 − 32 = 180, right?) Well, he stuck his thermometer into the coldest concoction he could make—a mixture of ice and a chemical called ammonium chloride—and called that temperature “zero.”
(What arrogance, Gabriel! Would nobody in human history ever be able to make a colder mixture? Why, two centuries later, we can make temperatures almost 460 degrees below your zero.) When he took his own temperature, the thermometer went up to around 100 degrees. (Okay, 98.6, but see the following for how that number came about.) That was a touch that Fahrenheit liked: Humans, he felt, should score 100 on his temperature scale.
Next, he stuck his thermometer into an ice-water mixture, and found that the mercury went up 32 degrees higher than in his zero-temperature mixture. And that’s how the freezing point of water came to be 32 degrees. Finally, if boiling water was to be 180 degrees higher than that, it would wind up at 32 + 180, or 212. End of Gabriel Fahrenheit’s story.

Six years after Fahrenheit’s body temperature became equal to that of his surroundings, a Swedish astronomer named Anders Celsius (1701–1744) proposed the centigrade scale of temperature, which we now call the Celsius scale. “Centigrade” means a hundred steps; he set the size of a degree so that there are 100 of them, not 180, between the freezing and boiling points of water.
Furthermore, he defined his “zero temperature” at the freezing point of water, a reference point that anyone could easily reproduce. And thus, the boiling point of water fell at 100 degrees. (Curiously, for reasons known only to eighteenth-century Swedish astronomers, Celsius originally took the freezing point as 100 and the boiling point as zero, but people turned it around after he died.)
And what about that number 98.6 as the “normal” human body temperature? It’s just a fluke. People’s temperatures vary quite a bit depending on the time of day, the time of month (for women), and just plain differences in metabolism. But it wavers around an average of 37 degrees Celsius for most people, so that’s what doctors have adopted as “normal.” And guess what 37 degrees Celsius converts to in Fahrenheit? Right—98.6, a number that looks for all the world as if it were more precise than it really is. That extra six-tenths of a degree is nothing but an accident of the conversion arithmetic and has no significance at all.

Speaking of conversions, I can’t resist the opportunity—I do it every chance I get—to publicize an easy way to convert temperatures. I don’t know why they continue to teach those complicated formulas in school, with all their 32s, parentheses and improper fractions, when there is a much simpler way that’s absolutely accurate.
To convert Celsius to Fahrenheit, add 40, multiply by 1.8, then subtract 40.
To convert Fahrenheit to Celsius, add 40, divide by 1.8, then subtract 40.
That’s all there is to it.
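For readers who like to see the trick spelled out, here is a minimal sketch of the two rules in Python (the function names are just illustrative, not from any standard library):

```python
def celsius_to_fahrenheit(c):
    """Add 40, multiply by 1.8, then subtract 40."""
    return (c + 40) * 1.8 - 40

def fahrenheit_to_celsius(f):
    """Add 40, divide by 1.8, then subtract 40."""
    return (f + 40) / 1.8 - 40

# Sanity checks: water freezes at 0 °C / 32 °F, boils at 100 °C / 212 °F,
# 37 °C is the famous 98.6 °F, and -40 is the same on both scales.
print(celsius_to_fahrenheit(0))            # 32.0
print(celsius_to_fahrenheit(100))          # 212.0
print(round(celsius_to_fahrenheit(37), 1)) # 98.6
print(fahrenheit_to_celsius(-40))          # -40.0
```

The same two functions work in both directions because the whole trick is one shift (by 40) and one scaling (by 1.8), just applied in opposite orders.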
It works because (a) 40 below zero is the same temperature on both scales and (b) a Celsius degree is 1.8 times as large as a Fahrenheit degree (180 ÷ 100 = 1.8).

A final point: Thermometers measure only their own temperatures. Think about it. A cold thermometer registers a low temperature; a hot thermometer registers a high temperature.
A thermometer doesn’t register the temperature of an object that you stick it into until it itself warms up to, or cools down to, that object’s temperature. That’s why you have to wait for the fever thermometer to warm up to your body’s temperature before you read it.