Why 32° and 212° for Fahrenheit?

Thread Starter

Wendy

Joined Mar 24, 2008
23,798
I asked this question on Google. As near as I can tell, 0° Fahrenheit is the freezing point of salt water (versus fresh water at 32°F), and to that 32°F you add 180° to get the 212° we know and love for boiling. I'm still wondering why Fahrenheit didn't make the jump and just select 0° and 100° as the two end points of the thermometer. I guess it seems more obvious after the fact?
 

SamR

Joined Mar 19, 2019
5,472
Because Daniel Gabriel Fahrenheit was Polish? He was also a thermometer maker. Instead of all the various Celsius-to-Fahrenheit conversion formulas, I taught my Chemistry students the basis of the scales. Celsius is based on the properties of water: the scale runs 100° from freeze to boil, while Fahrenheit spans 180° over the same range, so the factor is 1.8. But since Fahrenheit uses 32° as the freezing point of water, you also have to add or subtract 32, depending on which way you are converting. A lot of Chemistry and Physics is based on the properties of water.
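
A minimal sketch of those two conversions in Python, using only the 1.8 factor and the 32° offset described above (the function names are my own):

def celsius_to_fahrenheit(c):
    # Scale by 180/100 = 1.8, then shift up to Fahrenheit's 32° freezing point.
    return c * 1.8 + 32

def fahrenheit_to_celsius(f):
    # Remove the 32° offset first, then divide out the 1.8 factor.
    return (f - 32) / 1.8

# Sanity checks against the fixed points of water:
assert celsius_to_fahrenheit(0) == 32      # water freezes
assert celsius_to_fahrenheit(100) == 212   # water boils
assert fahrenheit_to_celsius(212) == 100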

Actually...
Daniel Gabriel Fahrenheit - Wikipedia
"According to Fahrenheit's 1724 article,[13][14] he determined his scale by reference to three fixed points of temperature. The lowest temperature was achieved by preparing a frigorific mixture of ice, water, and a salt ("ammonium chloride or even sea salt"), and waiting for the eutectic system to reach equilibrium temperature. The thermometer then was placed into the mixture and the liquid in the thermometer allowed to descend to its lowest point. The thermometer's reading there was taken as 0 °F. The second reference point was selected as the reading of the thermometer when it was placed in still water when ice was just forming on the surface.[15] This was assigned as 30 °F. The third calibration point, taken as 90 °F, was selected as the thermometer's reading when the instrument was placed under the arm or in the mouth.[16]"
 

crutschow

Joined Mar 14, 2008
38,331
I remember reading that 100°F was selected as the typical body temperature (but Fahrenheit was slightly off, as it's actually 98.6).

I like the Fahrenheit scale because it works well for human-scale temperatures.
You know that 0°F is really physically cold, 50°F is cool, and 100°F is really hot.
Celsius doesn't have that handy reference.

Also, a 1°F delta seems to be a nice control increment for a building thermostat.
A 1°C delta is too large.
 

MrChips

Joined Oct 2, 2009
34,630
A long time ago I learned that 32 and 212 had nothing to do with the choice of temperature scale.
0 was pinned as the coldest winter temperature ever experienced and body temperature was marked at 100. End of story.
 

MrChips

Joined Oct 2, 2009
34,630
"Experienced by whom? How was it recorded? To be useful as a reference point it should be reproducible, and I don't see how a one-off weather event could be."

Come on. We're talking about scientific experimentation conducted in the 1700s, when the first mercury thermometer was invented. We're not talking about NIST traceability here.

(It was the coldest temperature experienced in Denmark by Danish astronomer Ole Romer. How was it recorded? By putting a mark on the temperature-sensing device available at the time. Remember, no temperature scale existed at the time besides fiery hot, warm, and bloody cold.)
 

cmartinez

Joined Jan 17, 2007
8,727
If I remember correctly, 100 °F was referenced as the boiling point of some type of alcohol.

The big blunder IMHO that Mr. Fahrenheit made was using two different substances as references for his scale.
 

MrChips

Joined Oct 2, 2009
34,630
Temperature Scale Timeline

1702 - Danish astronomer Romer creates the Romer temperature scale
1708 - German-Polish physicist Fahrenheit visits Romer in Copenhagen
1713 - Fahrenheit creates the Fahrenheit scale
1742 - Swedish astronomer Anders Celsius creates Celsius temperature scale

Fahrenheit modified the Romer scale because he was mathematically challenged. He rounded 7.5 to 8 and 22.5 to 24 in order to make calculations easier.
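
As a quick arithmetic check of that rounding (the ×4 rescaling step below is the commonly cited historical account, my addition rather than something stated above):

# Fahrenheit's rounding of Romer's fixed points, per the post above:
rounded_freeze = 8    # water freezes at 7.5 °Rø, rounded up
rounded_body   = 24   # body temperature at 22.5 °Rø, rounded up

# The commonly cited x4 rescaling to get finer whole-number divisions
# (the x4 step is the standard historical account, not stated in this thread):
print(rounded_freeze * 4)   # 32 -> freezing point of water in °F
print(rounded_body * 4)     # 96 -> body temperature on Fahrenheit's original scale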
 

crutschow

Joined Mar 14, 2008
38,331
"Out here in the rest of the world we use decimal places to deal with that problem."

Of course you do (in the rest of the world).
But "set the temperature 1 degree higher" is easier to say than "set the temperature 0.5 degrees higher".

"love Celsius... so simple..."

We used Celsius for all our device temperature tests at work, as it's the standard for all technical temperature measurements, and I learned to live with it, but I still prefer Fahrenheit since 0-100 is a nice range for normal cold-to-hot ambient temperature readings.

"I think I'll stick to Celsius ... thank you very much ..."

To each his own.
I'm not trying to persuade you to do anything else.
 

SamR

Joined Mar 19, 2019
5,472
"Come on. We're talking about scientific experimentation conducted in the 1700s, when the first mercury thermometer was invented. We're not talking about NIST traceability here."

If you read the Wiki link I provided, you will find this:

[attached screenshot of the Wikipedia article]

Apparently it was Romer who was the real father of the Fahrenheit scale... Fahrenheit was making inferior thermometers and was taken to task by Romer for it. So Fahrenheit "modified" Romer's scale to improve the accuracy of his thermometers.
 