
Why Celsius?

David H | 00:37 Sat 05th Aug 2006 | Science
17 Answers
Why has the world except the US decided Celsius is superior to Fahrenheit, especially when Fahrenheit is far more precise with smaller increments?
And while I'm here, why did Mr Celsius become worthy of having the scale named after him, rather than the accurate description 'centigrade'? It is only the equivalent of 'decimal', as in coinage.

Answers


1795-1995 Bicentenary of the Decimal Metric System
On 7 April 1795 the National Convention of France decreed the new "Republican Measures" to be legal measures in France. The units of measurement included the meter, are, liter, and gram; and the prefixes centi, deci, deca, hecto, and kilo. This was the decimal system of measurement units or the decimal metric system that has survived practically unchanged as the basis of the modern International System of Units, or SI for short.

"A TOUS LES TEMPS; A TOUS LES PEUPLES"
["FOR ALL TIME; FOR ALL PEOPLES"]

blame the French
Fahrenheit is not "more precise"; it all depends on how accurately one measures things. The accuracy depends on the quality of the manufacture of the thermometer, not the increments used.
"And while I'm here why did Mr Celsius become worthy of the scale rather than the accurate description of centigrade?"

Probably for a similar reason that the other scale is named after Mr Fahrenheit rather than the accurate description: "Arbitrary".
Slight aside, but what's the scale called that's measured in Fahrenheit-sized increments and starts at absolute zero?
It is to Fahrenheit what Kelvin is to Celsius. Denoted by an R I think.
OK well try converting -40 to centigrade
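(For anyone who wants to check that one: setting F = C in F = 9C/5 + 32 gives C = -40, so -40 is the one temperature where the two scales read the same. A minimal sketch in Python - the function names are just illustrative:)

def f_to_c(f):
    # Linear map: remove the 32-degree offset, then rescale 180 F-steps to 100 C-steps
    return (f - 32) * 5 / 9

def c_to_f(c):
    # Inverse of the above
    return c * 9 / 5 + 32

print(f_to_c(-40))  # -40.0: the two scales cross here
print(c_to_f(-40))  # -40.0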
Er, because water freezes at 0 degrees Celsius and boils at 100 degrees Celsius.

Celsius is easier.

The US isn't always right about everything.
Question Author
Thank you everyone, so the curse of Napoleon continues... At least we still drive on the left so the US slipped up somewhere.
I say Fahrenheit is more precise as you can get a couple more numbers in, and especially in medical recording one whole degree can sometimes be crucial. I liked having 70 and 71 instead of just 20, but I suppose the march of metric pushes on.
Ugly_bob......

Absolute zero is -273.15C or -459.67F, which is the lowest temperature theoretically possible. You may be thinking of the Réaumur scale, where the melting point of ice is taken as 0R and the boiling point as 80R. Named after René Antoine Réaumur (1683-1757). I have no idea if it is used for anything these days; perhaps someone else will be able to enlighten you.
Snowy Owl - not sure what the scale was; I've never actually seen it used anywhere. I just remember my calculator at school having the option to convert between 4 different temp scales - Centigrade (C), Fahrenheit (F), Kelvin (K) and something denoted as R, which started at absolute zero (like Kelvin) but incremented in steps equal to Fahrenheit degrees.
Funny really, but I'd have thought that putting a decimal point and then more numbers, such as 38.7456, allows you to be as accurate as you want - but maybe you bought a cheapo thermometer with only full divisions on it!!
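(Whickerman's point is easy to put numbers on: the conversion between the two scales is a linear map, so every decimal place survives it, and neither scale can record a distinction the other cannot. A small sketch, reusing the illustrative converter from the earlier note:)

def c_to_f(c):
    # Same linear map as before
    return c * 9 / 5 + 32

print(c_to_f(38.7456))  # ~101.74208: all the decimals carry over
print(c_to_f(38.7))     # ~101.66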
I think that Mr Fahrenheit was seriously lacking in imagination when he introduced his scale, unlike Mr Celsius. Of course, Mr Celsius could have used 1000° instead of 100° for the boiling point of water. That would have given David his smaller increments. But at higher temperatures, the numbers might have got out of hand.
As to why 'Celsius' instead of 'centigrade' - blame the Americans. They think it's a swell idea to name things after their discoverers. In the same way, our electricity used to be delivered at 50 cps (cycles per second); now it comes in Hertz. Foot-pounds changed to some other thing called Newtons, etc, etc. Fine, except they ravaged the self-explanatory labels and replaced them with names you have to look up to find out what the heck they're talking about.
ugly_bob: you're thinking of the Rankine scale, I suppose (there's a short sketch of it after this post).
David H: As bernardo said, none of these temperature scales is "more precise" than another, because you can use as many decimals as you want; you don't have to use whole numbers.
heathfield: foot-pounds are not Newtons...

Doesn't the word "centigrade" also apply to the Fahrenheit scale? It is defined by its 0 and 100 measurements, just like the Celsius scale is.
The reason Celsius became popular is that its two reference points (0 and 100) were chosen to be more reliable than the ones Mr Fahrenheit used.

The reason the US prefers Fahrenheit is that a lot of people just assume that whatever they are used to is better than the rest, even when it's not.
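(To close off ugly_bob's aside: the R scale is Rankine, which is to Fahrenheit exactly what Kelvin is to Celsius - same-sized degrees counted from absolute zero, so R = F + 459.67, just as K = C + 273.15. A small sketch, with the same caveat that the names are illustrative:)

def f_to_rankine(f):
    # Fahrenheit-sized degrees counted from absolute zero
    return f + 459.67

def c_to_kelvin(c):
    # Celsius-sized degrees counted from absolute zero
    return c + 273.15

print(f_to_rankine(-459.67))  # 0.0: absolute zero
print(c_to_kelvin(-273.15))   # 0.0: absolute zero
print(f_to_rankine(32))       # ~491.67: melting point of ice
print(c_to_kelvin(0))         # 273.15: melting point of ice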
Space - wrong on only three points. 1) Read what I said, that foot-pounds changed into some other thing called Newtons, i.e., a different measurement of torque. 2) Centigrade's 0 to 100 does not apply to Fahrenheit. Fahrenheit's scale is 32° to 212°, corresponding to centigrade's 0° to 100°. 3) Fahrenheit and centigrade are equally reliable at measuring temperature; it's only the scales that are different. Otherwise it's like saying 'centimetres are more reliable than inches'.
heathfield, if you think Newtons are a measure of torque... then please don't label what I write "wrong" ...

On 2) and 3), you are talking about the current definition of the Fahrenheit scale; I was talking about the one defined by Mr Fahrenheit.
By the way, if you really want to be picky, the current definition of the Celsius scale isn't even centigrade any more, because it has not been defined by its 0 and 100 points since 1954 (according to Wikipedia).

