The Most Misquoted Man in Electronics-Industry History, Part 1

Most people with Internet access, even those well outside the electronics industry, have heard of Moore's Law. Few, however, know what its namesake, Fairchild Semiconductor and Intel co-founder Gordon Moore, actually said, and fewer still the context in which he said it.

First, Moore's Law isn't a law at all. Though the industry has allegedly stayed on a curve Moore described for five times the one-decade prediction window he offered, there is nothing immutable about the prediction. Yes, it's had an impressive run; no, you wouldn't want to tattoo it on your chest. It does not rank with expressions legitimately called laws, such as those collectively known as Maxwell's Equations or those ascribed to Kirchhoff. Nor did Moore claim any ultimate dependability for his observation.

On this basis, perhaps Moore's Law was an inspired self-fulfilling prophecy — what became the semiconductor industry's most terse and effective bit of marketing, but, in the 21st century, more marketing than science: Rarely does one see a reference to Moore's Law that correctly states what Moore presented in his seminal 1965 article, “Cramming More Components Onto Integrated Circuits.”

What do they claim? Well, many sources, such as the supposedly authoritative if not particularly technical Merriam-Webster online dictionary, say that Moore's Law states that “processing power doubles about every 18 months especially relative to cost or size.” This would have been particularly clever were it true: At the time Moore wrote his article, the cost-optimal number of devices on a single die was about 50 — sufficient for SSI logic and simple analog blocks such as early operational amplifiers and comparators. The first commercially available integrated-circuit processor — the Intel 4004 — wasn't seen for another six years, so the issue of “processing power” at the IC level was in no way central to Moore's discussion.

PC Magazine’s version of Moore's Law is that “the number of transistors and resistors on a chip doubles every 18 months.” Its grammatical slip notwithstanding — all chips maintain exactly the same number of transistors and resistors they had when fabricated — it misses the very same critical element that most others do when citing Moore: He wasn’t talking about what would be possible or even what would be common practice. Moore was suggesting that, at any level of IC technology development, there would be an economically optimal number of transistors to cohabit a die and that that number was likely to double every 18 months or so. This isn't a subtle difference: It lies at the core of what distinguishes commercially successful technologies from laboratory demonstrations and existence proofs, which is to say, something just south of a third of a trillion dollars in revenue worldwide.
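As a back-of-the-envelope illustration — not anything taken from Moore's paper — here is a minimal Python sketch of that compounding claim, taking the roughly 50-component economic optimum of 1965 as a baseline and treating the doubling period as an explicit assumption:

```python
def projected_optimum(year, base_year=1965, base_count=50, doubling_years=1.5):
    """Project the cost-optimal component count per die under an
    assumed fixed doubling cadence (1.5 years, per the common
    '18 months or so' reading; Moore's own figures differed)."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Illustrative projections from the 1965 baseline:
for year in (1965, 1975, 1985):
    print(year, round(projected_optimum(year)))
```

Even a modest change to the assumed doubling period swings the projection by orders of magnitude over two decades, which is one reason the period attributed to Moore matters so much.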

Playing fast and loose with Moore's prediction isn’t limited to the popular press but, alas, is also an activity in which their expert sources participate. For example, the founding director of HP Labs' Information and Quantum Systems Laboratory was quoted as saying, “Moore's Law in itself has evolved and morphed in time. It used to be the number of transistors in a chip, but now it means exponential growth in capability on a chip.”

The law has evolved and morphed? Resisting the temptation to try that trick with Newton for just a minute, even were we to agree to a redefinition of terms, how are we to measure this alleged “exponential growth in capability” with functional blocks that are not parametrically or functionally comparable? How are we to evaluate claims that we are or are not on Moore's curve if we've changed the axes along the way? As a good friend and mentor used to say, these are ideas that “melt in the ear, not in the mind” if we are to apply Moore's Law to engineering comparisons and not only as a peg to hold up marketing claims.

How did we, as an industry, lose track of Moore's original insight? See part two of this article.

4 comments on “The Most Misquoted Man in Electronics-Industry History, Part 1”

  1. goafrit2
    March 1, 2013

    >> How did we, as an industry, lose track of Moore's original insight?

    We did not lose track of the original insight. Intel just invented the “law” to create a great marketing strategy. With this quest to keep the “law” alive, they remind you that you need to upgrade every 18-24 months.

  2. goafrit2
    March 1, 2013

    >> First, Moore's Law isn't a law at all.

    You are correct, and I do not think Gordon has ever said it was a law. He noted that there is nothing like forever and never argued that it was a law. The problem has been that Intel has simply exploited this extrapolative expectation, that more speed arrives on schedule even without any incremental intelligence in processors, to rule the market. The world is not about speed; people want smart systems.

  3. eafpres
    March 1, 2013

    Your article reminded me of an article in The Economist newspaper in 2011 regarding an interesting correlation: that the energy efficiency of processors has been doubling about every 1.6 years since ENIAC. The metric is in computations per kWh. Here is the article:

    Energy Efficiency of Processors

    Regardless of its headline, this also is not a “law.”


  4.
    March 5, 2013

    It could possibly be said that the area of a circuit could be reduced by half every 18 months. But this would not mean the chip size would be reduced equally since there are still mandatory space requirements such as margin around the die for wafer cutting, pad size and count, etc.
