One of the many irritations I have with so much of the “reporting” I read is the poor use (intentional or not) of statistics and data. I have seen stories talking about how a change in some parameter from 4% to 8% is a rise of 4 percent, when it is really a rise of four percentage points or a doubling of the initial value. I have seen misuse of significant figures, such as a “one-kilometer” distance also being reported as 0.6213 miles, which is arithmetically correct but implies precision far beyond the original casual metric measure.
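To make those two gripes concrete, here is a quick Python sketch; the numbers are the ones from the examples above, and the helper names are mine:

```python
# Illustrating "percent" vs. "percentage points", and significant figures.

def percent_change(old, new):
    """Relative change, as a percent of the starting value."""
    return 100.0 * (new - old) / old

def point_change(old_pct, new_pct):
    """Absolute difference between two percentages, in percentage points."""
    return new_pct - old_pct

# A parameter moving from 4% to 8% is a rise of four percentage points,
# but a 100% rise (a doubling) of the initial value:
rise_points = point_change(4, 8)      # 4
rise_percent = percent_change(4, 8)   # 100.0

# Significant figures: a casual "one kilometer" does not justify
# reporting four decimal places of miles.
miles_exact = 1.0 * 0.621371            # arithmetically correct
miles_sensible = round(miles_exact, 1)  # matches the input's precision: 0.6
```

The two functions give very different answers for the same pair of numbers, which is exactly why the wording matters.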
Sad to say, I have even seen that mistake in scientific-journal papers; either the author is naïve, to be polite, or the presumed reviewer didn’t do the job. And try explaining the difference between accuracy and precision—often a waste of time. Note that the subject of numerical illiteracy is not new; it was forcefully discussed in the 1988 book “Innumeracy: Mathematical Illiteracy and Its Consequences” by math professor John Allen Paulos (who apparently coined the very descriptive term “innumeracy” as well). Claims of extreme precision and accuracy are especially critical in analog design, as it is very tough to develop a circuit and system with consistent performance to 0.01%, while in the digital world you can claim as much precision as you like just by adding more bits to the numerical representation.
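For readers who find the accuracy/precision distinction slippery, a toy numeric example may help (the readings below are invented for illustration): accuracy is closeness to the true value, precision is repeatability.

```python
import statistics

# Invented readings from two hypothetical thermometers checked against a
# known 23.000 degC reference.
TRUE_TEMP = 23.000

# Instrument A: precise but inaccurate (tight spread, systematic offset)
a = [23.512, 23.509, 23.511, 23.510, 23.508]
# Instrument B: accurate but imprecise (centered on truth, wide scatter)
b = [22.7, 23.4, 22.9, 23.2, 22.8]

bias_a = statistics.mean(a) - TRUE_TEMP   # ~ +0.51 degC systematic error
bias_b = statistics.mean(b) - TRUE_TEMP   # ~ 0 degC systematic error
spread_a = statistics.stdev(a)            # ~ 0.0016 degC repeatability
spread_b = statistics.stdev(b)            # ~ 0.29 degC repeatability
```

Instrument A would look superb on a repeatability test yet be off by half a degree; instrument B averages out right but no single reading can be trusted to better than a few tenths.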
That’s why I was somewhat shocked when I saw a headline indicating that a new thermometer read to 0.001°C. No way, I figured: that’s either a simple error in writing or some excessive zeal in unit conversion. The story was taken directly from a NIST (National Institute of Standards and Technology) announcement, and the NIST people are pretty careful with their significant figures, precision, accuracy, and related numeracy issues. After all, look at the care they and complementary institutes around the world used when fixing the fundamental physical constants for the artifact-free redefinition of the SI system, Figure 1 and References 1 through 4.
So I read further, and found that the 0.001°C number was correct: a NIST team had developed a thermal-infrared radiation thermometer (TIRT) for the -50°C (-58°F) to 150°C (302°F) range (corresponding to infrared wavelengths between 8 and 14 micrometers) that can measure temperatures with a precision of a few thousandths of a degree Celsius. Even better, it does not require cryogenic cooling, as many other high-performance infrared temperature sensors do.
The first question, of course, is: who needs that? They say it is needed for medical and scientific studies, such as satellite-based measurement of the surface temperature of oceans. The second question is: how did they achieve this? It’s a combination of a new approach; the usual tactic of understanding and analyzing every error source and then seeking to minimize or eliminate it; and a feedback mechanism to stabilize performance around known points.
NIST’s Ambient-Radiation Thermometer (ART), Figure 2, uses a set of internal thermometers to constantly gauge temperatures at different points in the instrument; those readings are then used in a feedback loop that keeps the 30-cm (12-inch) core cylinder containing the detector assembly at a constant temperature of 23°C (72°F).
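The feedback idea can be sketched in a few lines. This is a generic proportional-integral thermostat simulation of my own—not NIST’s actual controller—with invented gains and a deliberately crude thermal model:

```python
# Generic PI thermostat sketch (not NIST's actual control loop): simulated
# internal-thermometer readings feed a heater/cooler drive that holds the
# core at the 23 degC setpoint. Gains and thermal model are illustrative.

SETPOINT = 23.0        # degC, target core temperature
KP, KI = 2.0, 0.1      # invented proportional and integral gains

def run_loop(ambient, steps=500, dt=1.0):
    temp, integral = ambient, 0.0
    for _ in range(steps):
        error = SETPOINT - temp              # thermometer reading vs. target
        integral += error * dt
        drive = KP * error + KI * integral   # heater (+) / cooler (-) power
        # Crude first-order thermal model: the drive pushes the core toward
        # the setpoint while the ambient slowly pulls it away.
        temp += (0.05 * drive + 0.01 * (ambient - temp)) * dt
    return temp
```

Started from an 18°C (or 30°C) ambient, the loop settles at the setpoint; the integral term is what removes the steady-state offset a proportional-only controller would leave behind.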
The unit also includes additional focusing features to reduce errors due to IR radiation entering the instrument from outside the targeted field of view (called the size-of-source effect), Figure 3. The design is described in full detail in the paper with the very modest title “Improvements in the design of thermal infrared radiation thermometers and sensors,” published in Optics Express from the Optical Society of America.
My third question was: how do you calibrate such an instrument, to be able to claim such performance? Again, they have certainly done their homework, as you would expect. The paper describes two techniques using their advanced infrared radiometry and imaging facility with standard platinum resistance thermometers (RTDs). This was done with both water-bath and ammonia heat-pipe blackbodies, along with extensive corrections for the real-world physics of blackbody radiation and RTDs. No doubt about it: this is impressive theoretical and applied science and engineering, and one of the few cases where I can accept claiming measurement to 0.001°C as meaningful.
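The physics behind those blackbody calibration sources is Planck’s law, and a small sketch of my own (not from the paper) hints at why millikelvin resolution is so demanding: near 10 µm and room temperature, a 1 mK temperature change shifts the radiance by only a couple of parts in 10⁵.

```python
import math

# Exact SI-defined constants (from the same redefinition effort cited above)
H = 6.62607015e-34   # Planck constant, J*s
C = 299792458.0      # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wavelength_m, temp_k):
    """Spectral radiance of an ideal blackbody, in W/(m^2 * sr * m)."""
    a = 2.0 * H * C**2 / wavelength_m**5
    b = math.expm1(H * C / (wavelength_m * KB * temp_k))
    return a / b

wl = 10e-6                           # 10 um, mid-band for the 8-14 um TIRT
r1 = planck_radiance(wl, 296.150)    # blackbody at 23.000 degC
r2 = planck_radiance(wl, 296.151)    # blackbody at 23.001 degC
sensitivity = (r2 - r1) / r1         # fractional radiance change per mK
```

Resolving a signal change that small is what forces the obsessive attention to every internal temperature gradient and stray-radiation path.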
Have you ever had a design or function that misled you due to unjustified claims of precision in the initial analysis or actual specifications?
References (all from NIST)
- “For All Times, For All Peoples: How Replacing the Kilogram Empowers Industry”
- “A Turning Point for Humanity: Redefining the World’s Measurement System”
- “Toward the SI System Based on Fundamental Constants: Weighing the Electron”
- “Universe’s Constants Now Known with Sufficient Certainty to Completely Redefine the International System of Units”
- “Precise Temperature Measurements with Invisible Light”
Related Content
- What analog’s “imperfections” taught me
- Do analog ICs still need perfection?
- ‘Never say never’ when it comes to Einstein’s measurement supposition
- Optical measurement of temperature raises existential questions
- Thermocouples: Simple but misunderstood
- Clever techniques improve thermocouple measurements by Jim Williams
- Thermocouple physics: Even Jim Williams got it wrong
- Stay off the HOT SEAT when choosing temperature sensors