> Only the truly stupid would assume accuracy from decimal places.
Well then, tell all the teachers in this world that they're stupid, and tell
everyone who learnt from them as well. I'm in high school (grade 11, junior)
and my physics teacher is always screaming at us for writing too many
decimal places or using them inconsistently. There are situations where
writing an explicit ±1 is too cumbersome or clumsy, so you specify the
accuracy implicitly through the number of decimal places instead.
For example, 5.00 means pretty much spot-on 5 (anywhere from 4.995 up to,
but not including, 5.005), whereas a plain 5 could mean anywhere from 4.5
up to 5.5.
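To put numbers on it, here's a quick C sketch (purely illustrative, nothing
from any real sensor code) that prints the interval implied by quoting a
reading to d decimal places, namely half of the last digit's place value on
either side:

#include <stdio.h>
#include <math.h>

int main(void)
{
	double reading = 5.0;	/* the quoted value */
	int d;

	for (d = 0; d <= 2; d++) {
		/* half the place value of the last printed digit */
		double half_ulp = 0.5 * pow(10.0, -d);

		printf("%.*f implies [%g, %g)\n",
		       d, reading, reading - half_ulp, reading + half_ulp);
	}
	return 0;
}

Compile with gcc -lm; it prints "5 implies [4.5, 5.5)", "5.0 implies
[4.95, 5.05)" and "5.00 implies [4.995, 5.005)", which is exactly the
convention above.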
Please, let's quit this dumb argument. We all know that thermistors and
other types of cheap temperature gauges are very inaccurate, and I don't
think expensive thermocouples will make it into computer sensors very soon.
Plus, who the hell cares whether their chip is at 45.4 or 45.5 degrees?
Does it really matter? A difference of 0.1 degrees will not decide whether
your chip fries.
Just my 2 eurocents.
--
Chris Boot <bootc@worldnet.fr>

"DOS Computers manufactured by companies such as IBM, Compaq, Tandy, and
millions of others are by far the most popular, with about 70 million
machines in use worldwide. Macintosh fans, on the other hand, may note
that cockroaches are far more numerous than humans, and that numbers
alone do not denote a higher life form."
	-- New York Times, November 26, 1991