In the analog world of transducer input/output, you're in pretty good shape if you can consistently achieve performance — a term I'll use here as a combination of accuracy, resolution, and repeatability — of 0.1 percent.
Wait a moment: That's only 1 part in 1000, or 10 bits, which seems very coarse alongside 16- to 24-bit converters and processors crunching at 32 bits and more. I can understand the processors needing more bits, to minimize cumulative errors that build up over repeated calculation steps. But what about the analog side?
The reality is that when you do a proper error budget all the way from transducer to A/D converter — taking into account the various analog signal-chain components, noise, temperature coefficients, drift over time, bias currents, and other factors — 0.1 percent is about what you'll get. Of course, you can do better if you trim the signal channel and calibrate it; in fact, if you calibrate it under various conditions and operating points (voltage, temperature), you may get down to 0.05 percent (11 bits) or even 0.025 percent (12 bits). It's really hard to do better than that, and rarely needed.
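The percent-to-bits arithmetic above, and the idea of combining independent error sources into a budget, can be sketched in a few lines. This is a rough illustration only: the function names and the example error-budget numbers are hypothetical, not taken from any particular signal chain, and real budgets require judgment about which errors are truly uncorrelated.

```python
import math

def equivalent_bits(fraction_of_full_scale):
    """Convert a fractional accuracy (e.g. 0.001 for 0.1%) to the
    equivalent number of resolution bits: log2 of one part in N."""
    return math.log2(1.0 / fraction_of_full_scale)

def rss_error(*fractional_errors):
    """Combine independent error sources by root-sum-square, the
    usual assumption for uncorrelated errors in a budget."""
    return math.sqrt(sum(e * e for e in fractional_errors))

# The figures from the text: 0.1%, 0.05%, 0.025% of full scale.
print(round(equivalent_bits(0.001), 1))    # about 10 bits
print(round(equivalent_bits(0.0005), 1))   # about 11 bits
print(round(equivalent_bits(0.00025), 1))  # about 12 bits

# Hypothetical signal-chain budget: sensor error, amplifier
# offset/drift, noise, and ADC error, each as a fraction of
# full scale. The combined result lands near the 0.1% figure.
budget = rss_error(0.0007, 0.0004, 0.0003, 0.0002)
print(f"combined: {budget * 100:.3f}% -> "
      f"{equivalent_bits(budget):.1f} bits")
```

Notice that even though each individual source here is well under 0.1 percent, the combination is not far from it — which is why the overall chain rarely beats the rule of thumb.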
Seeing that gap between real-world performance and the presumed number of bits available in the system taught me to step back and think about the subsequent analysis. The most important questions I learned to ask are these: “Does this data/answer/analysis make sense? Or does it make no sense at all?” I learned to be humble and skeptical about the acquired data and analysis, as it's easy to get so wrapped up in apparent precision that you miss the bigger picture: that something isn't quite right.
We all know the computing acronym GIGO, short for “garbage in, garbage out.” But there is a rarely mentioned alternate interpretation of GIGO, namely, “garbage in, gold out.” In other words, if the result says something is such-and-such, then it must be so — how can you doubt it?
Many years ago, I had an instructor whose admonition to us was simple: “Before you compute, stop and think.” He insisted we first rough out what the range of sensible answers and conclusions might be, using reasonable approximations. He wanted us to do what he referred to as “back of the envelope” calculations, and he was so insistent that I even made up a special “gag” pad entirely of envelopes, to demonstrate that what I had done was indeed just such a quick and rough calculation.
Have you ever been unintentionally misled by someone else's excess precision, when they were precisely wrong rather than roughly right? Worse, have you ever fooled yourself by jumping to precise yet misleading conclusions?