We all know that the “normal” human-body temperature is 98.6°F (37.0°C). It’s the standard, accepted number, repeated as a fact with almost the same certainty as the speed of light or the other defined fundamental physical constants. But is that number correct?
Some people always seem to run a little warmer or cooler – and that’s “normal” for them – but the rest of us have been told that 98.6°F/37°C is the baseline against which to judge whether a person has no fever at all, a low-grade fever that’s nothing to worry about, or a high-grade fever that signals a real medical problem.
Well, maybe, or maybe not. Perhaps that number was correct but is no longer so, or perhaps it was never quite right. A recent detailed, credible study by a medical-research team at the Stanford University School of Medicine concluded that the hallowed, enshrined number may now be high (at least in the United States) by about 1°F (≈0.5°C).
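The rough equivalence cited above (1°F ≈ 0.5°C) follows from the fact that a Fahrenheit degree is 5/9 the size of a Celsius degree – and note that converting a temperature *difference* is not the same as converting an absolute temperature, which also needs the 32° offset. A quick sketch (the function names are mine, for illustration):

```python
def f_to_c(temp_f):
    """Convert an absolute temperature in Fahrenheit to Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

def delta_f_to_c(delta_f):
    """Convert a temperature *difference* -- no 32-degree offset applies."""
    return delta_f * 5.0 / 9.0

print(round(f_to_c(98.6), 1))       # 37.0 -- the classic "normal"
print(round(delta_f_to_c(1.0), 2))  # 0.56 -- the ~0.5 degC shift the study found
```

Mixing up the two conversions is a classic units error; a 1°F drop is a 0.56°C drop, not a (1 − 32) × 5/9 = −17.2°C one.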
A little history may help: the 98.6°F standard was originally established by German physician Carl Reinhold August Wunderlich, who published the figure in a book in 1868. The authors of the new study analyzed Wunderlich’s data along with other sources, such as medical records from veterans of the Civil War, then outlined some possible reasons for the change in “normal”:
- basic thermometry was not advanced, and the calibration of the thermometers used could not be assured;
- due to the public-health shortcomings of the era (poor sanitation, contaminated water, animal-borne diseases, the absence of antibiotics, and more), people in those days often lived with continuous low-grade infections; such a commonplace but not debilitating condition would have raised their temperatures slightly;
- finally, human physiology has likely changed over the intervening centuries, due to the introduction of air conditioning and more moderate (heated/cooled) indoor-living conditions, along with shifts in basic metabolism driven by changes in body mass, muscle, diet, and other factors.
I found the study interesting, as it is not just a case of enhancing the precision of a metrology standard or an instrumentation reading in some of its highest-precision digits. Instead, this is a large-scale re-definition of the nominal value by a significant percentage. (For more information, check out the full paper “Decreasing human body temperature in the United States since the Industrial Revolution” published at eLife, or the summary “Human body temperature has decreased in United States, study finds” at the Stanford Medical School site.)
It makes me wonder: are there situations in engineering where long-held data, assumptions, guidelines, or so-called “rules of thumb” turn out to be incorrect, or perhaps were applied too broadly? For example, the Wright brothers – who were actually meticulous scientists and engineers, not just bicycle-shop owners and tinkerers despite the simplistic popular portrayal – found their wing-lift results fell short of what they should have seen based on tables published by leading, respected sources of that period. As the brothers were not able to account for the discrepancies, they went back to basics, built their own wind tunnel, ran formal tests, and obtained very different results for airfoil performance.
Have you ever been involved in a project where some of the fundamental assumptions turned out to be inaccurate enough to affect the design or analysis? Did you ever have to do some deep digging yourself into very basic “facts” that everyone assumed were correct, due to their longevity and the “everyone knows” factor?
Related Content:
- JPL & NASA’s latest clock: almost perfect
- Initiating a Short-Circuit Fault for Battery Test
- The Wright Brothers: Test Engineers as Well as Inventors
- When extreme-precision numbers are legitimate
- Give Unique Test Gear Some Respect
- Perfect Sensor, Imperfect Test
- An introduction to acoustic thermometry