An Elegy for Moore's Law
The Jevons Paradox and Moore's Law
The Jevons Paradox derives from observations put forth by William Stanley Jevons in his 1865 book, The Coal Question, where he observed that the consumption of coal soared after James Watt introduced his coal-fired steam engine, one that greatly improved the efficiency of Thomas Newcomen's prior design. It is the proposition that technological progress that increases the efficiency with which a resource is used tends to increase the rate of consumption of that resource. Applied to integrated circuits, a reduction in the cost of integrated transistors through process and circuit enhancements increases the rate of such integration, typically beyond the benefit derived from the cost reduction. Yes, that observation neatly fits the trend proposed by Gordon Moore, or the eponymous Moore's Law, so coined by Carver Mead. Sadly, prior knowledge from which such an observation could, by physics or by reason, be derived remained unattributed and unreferenced in the seminal article published by Moore 100 years after Jevons's book.
Figure 1: A cost-of-integration trend proposed for integrated circuit components.
But that — a reduction in cost per manufactured component leading to increased integration of such components on a chip — was not the only prediction Moore offered in his article. He went on to add another graph, relating to the 'complexity' of components (how many components go into realizing an integrated function), that showed the number of transistors doubling every year: one in 1959, eight in 1962, and 64 in 1965. To be fair, those were the only data points he could fit nicely in those days, and with these three points he boldly drew a trend line. Neat! In six years, his components per integrated function had risen to 2⁶, or 64, and three points could indeed make a line, though the two additional data points he had made that line a bit crooked. Nevertheless, he bravely proposed this rule: transistors in chips would double every year.
That courage led to a flurry of activity in the industry — and repeated revision of the rule. It became a doubling every two years, and then settled into a doubling every year and a half. One had to make allowances for the vagaries of manufacturing and whatever market challenges came along. This is the so-called 'tick' of fabrication process development, or the 10nm → 7nm → 5nm lithographic minimum-dimension forecast — one that may be slowing down further in these interesting days.
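The arithmetic behind these cadences is simple enough to sketch. Here is a minimal illustration, using only numbers already stated above (Moore's 1959 starting point and his 1962 and 1965 counts); the `components` function and its parameter names are purely illustrative, not anything from Moore's article:

```python
# Illustrative sketch: project components per integrated function
# under different doubling cadences, starting from Moore's 1959
# data point of one component.

def components(year: int, base_year: int = 1959,
               doubling_period_years: float = 1.0) -> float:
    """Components per function after (year - base_year) years,
    doubling once every `doubling_period_years` years."""
    return 2 ** ((year - base_year) / doubling_period_years)

# Moore's original yearly doubling reproduces his three data points:
print(components(1959))  # 1.0
print(components(1962))  # 8.0
print(components(1965))  # 64.0

# The slower cadences the rule later settled into yield far fewer
# components over the same six years:
print(components(1965, doubling_period_years=2.0))  # 8.0
print(components(1965, doubling_period_years=1.5))  # 16.0
```

The gap between 64 and 8 over a mere six years shows how sensitive any exponential forecast is to the assumed doubling period — one reason the cadence kept being renegotiated.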
What would Jevons say?
The Jevons Paradox could be employed to describe the progression of transistor integration, computing performance, frequency, and chip power over the decades since early IC development. Recall that Jevons observed that the more efficient use of a resource leads to greater utilization of that very resource, typically overcoming the cost benefit of the efficiency improvement. Reducing the cost of integrating transistors therefore leads to the use of more transistors, or greater integration, surpassing the initial cost benefit. See how well that fits Figure 1? Similarly, increased chip operating frequency, or improved performance, fuels greater demand for frequency increases, and leads to transistors scaled inexorably smaller to permit those increases while attempting to contain power consumption. This leads to challenges relating to power, energy, and power integrity, described in detail in the references. While Moore observed an integration trend, Jevons foresaw potential consequences of unrestrained technological progress in general.
Applying the Jevons Paradox to the critical question of energy consumption, increased energy efficiency leads to an increase in energy utilization, which potentially places greater strain on energy infrastructure. But this thought experiment ignores many practical limits to such progress, which at the electronic system level include efficient energy delivery, increased cost, and the availability of applications to exhaust energy upon. In complex systems, it is more often the case that increased efficiency in a critical component promotes a similar increase in efficiency in cooperating critical components. Nevertheless, the realization that one could integrate rapidly overcame any thought of whether one should, and this led to dramatic power and energy challenges.
Exponential growth in transistor integration has led to challenges for support systems such as power delivery networks; IC design and computer-aided IC design tools were no exception. As ULSI systems scaled further, traditional design and verification of chips employing polygonal placement, electrical parameter extraction, and circuit simulation became exponentially more difficult, requiring novel techniques for the analysis and management of power integrity. Humans are great at problem-solving and tool-making; this, and its communication, is the challenge we've applied ourselves to at Anasim Corp.
Why was it Moore's Law?
Blame that on Carver Mead, who was consulting then at the company Gordon Moore co-founded. In the long and revered tradition of appending one's sponsor's name to one's flighty ideas, Mead saw fit to coin the term “Moore's Law.” But this was no physical law; it had no basis in transistor device physics or mechanical fabrication principles as far as I can tell. It was based upon data, manufacturing results, and reliable engineering estimation. And it was catchy, a slogan, a call to arms that the budding IC industry could adhere to.
So, Moore's Law came into being, and has reigned regally for the past many decades. Astonishing developments in electronics can indeed be attributed to the labor, the industry, that was inspired by the “law.” And the Intel 4004, designed in 1970 (or was it '71?), contained about 2¹¹, or ∼2048, transistors. But just don't expect a processor (or any sort of chip, 3D, multi-layer, or interposer assembly) to contain 2⁶⁰ — a quintillion — transistors in 2019, 60 years from Moore's 1959 data point.
An elegy, as you know, is a lament, typically for something that has passed on; something dead, to put it bluntly.
I've long felt that such non-physical trends and observations are more an imposition on the scientific mind than a guide to sustainable development. I think naming such a trend a “law” adds insult to this injury — and is more an expression of ego, a projection of someone's lack of humility, than wisdom that soothes and enlightens. Besides, lacking any references for this gem of insight into semiconductor physics, it smacks of something pulled out of — you know what I mean.
Has it been useful? Undoubtedly so, for the progress of many semiconductor companies and much technology development depended upon it. Many have profited from this observation and efforts aligned with it.
Yet is it true? Is it a near-timeless law of nature? If not, it's best that it dies a good death, as many other good things are known to do. As Gordon Moore himself declared in 2003, “No exponential is forever.” Let that be true.
 Wikipedia, The Free Encyclopedia, retrieved 2009.
 Raj Nair, “A power integrity wall follows the power wall,” Anasim Corporation, 2008.
 Raj Nair and Donald Bennett, “Power Integrity Analysis and Management for Integrated Circuits,” Prentice-Hall PTR Signal Integrity Series, 2010.
 Masanori Hashimoto and Raj Nair, “Power Integrity for Nanoscale Integrated Systems,” McGraw Hill, 2014.
 Gordon E. Moore, “Cramming more components onto integrated circuits,” Electronics, Vol. 38, No. 8, April 19, 1965.