(This is a longer version of an editorial that appeared in EE Times on July 10, 2006)
I often hear pundits and commentators say that one definition of a fool (or worse) is someone who does the same thing over and over again but expects different results. This tossed-off statement reveals a sad but common ignorance of the engineering process among these supposedly smart people.
The reality is that it is fairly common for engineers to run the same test dozens or hundreds of times: to gather statistics on the situation, to get some sense of input/output relationships, to validate a concept or implementation, or to try to understand what's going on in their design. When you are looking for that elusive timing problem or software bug, that's usually the only way you can trap it. In fact, due to unavoidable, non-deterministic factors such as noise, you often need to run repeated tests to properly assess and assure performance.
It's even more critical when you drop down to the atomic and subatomic levels, where there is no simplistic, one-to-one action/outcome relationship. After all, the disposition of these quantum-level particles and waves only has meaning in a probabilistic sense. That's why researchers doing atom-smashing experiments do them over and over, as every trial yields different results in that strange world. Some leading technologies, such as quantum cryptography, actually depend on the roll-of-the-dice aspects of their process to function.
So should we take the easy path and blame the ignorance of the general public for consigning the kind of repetitious testing engineers do to such low status? Sadly, I think our profession bears much of the responsibility. Look in the mirror, and you'll see the source of the problem: we did it to ourselves, and we keep on doing it.
Our industry portrays the incredible innovation that is its driver as a linear, smooth, and straightforward process. We smoothly connect the dots of product design and development steps from A to B to C, as if the process were simple and clear-cut, with no need for backtracking and no need to repeat tests and trials. We promise and deliver on regular reductions in IC feature size, increases in wafer size, and enhancements in product capability.
Unforeseeable developments such as the transistor, the IC, the iPod, lasers, LEDs, and liquid crystals, just to name a few, have no place in this path, which is based on extrapolation as much as innovation.
Even the ubiquitous road map popularized by Intel and others (and now a mainstay of presentations to the public) emphasizes the perspective that we clearly know where we are going. It says that while there may be some obstacles, they are no big deal; there is no need to run trial after trial to figure out what's going on.
Too bad it isn't so. This progress takes sweat, insight, luck, brilliance, tenacity, and many other hard-to-define factors. It is not a deterministic process free of surprises, with little need to do a procedure over and over to understand what we are seeing. Far from being the trite mark of a fool, such repetition and examination of outcomes are part of our diligence in advancing the state of the art.