Analog Is More Important Than Digital – Scientific Proof

My first job out of college was testing and debugging control loading systems on flight simulators for the 747 and the “new” 767 airliners. In the late 1970s, the control loop that simulated the feel of the primary flight surfaces, and had to respond instantaneously to pilot inputs, was purely analog; the digital portion consisted of a 32-bit Gould SEL “super minicomputer” with Schottky TTL and a screamingly fast 6.67 MHz system clock. It merely provided voltage inputs via DACs into an op-amp summing junction, representing slowly changing parameters such as airspeed and pitch angle.

“Op Amp”, of course, is short for “Operational Amplifier”, first developed to perform mathematical operations in analog computing. Forget about FFTs, DSPs, and all that nonsense: back in the mid-70s (when analog giants walked the earth and the Intel 4004 was barely out of diapers) there was an analog IC for just about everything – multiplication and division, log and antilog operations, RMS-DC conversion, you name it.
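The log/antilog trick those analog multiplier ICs relied on is easy to sketch numerically: sum the logarithms of two signals and take the antilog of the result, and you recover their product. A minimal Python illustration of the principle (not a circuit model):

```python
import math

def log_antilog_multiply(a: float, b: float) -> float:
    """Multiply two positive 'voltages' the way a log/antilog analog
    multiplier does: take logs, sum them, then take the antilog.
    exp(ln(a) + ln(b)) == a * b."""
    return math.exp(math.log(a) + math.log(b))

print(log_antilog_multiply(3.0, 4.0))  # ~12.0
```

The same log-domain trick turns division into subtraction, which is why one log/antilog building block covered so many of those 1970s functions.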

With the later rise of the Dark Side, of course, many of those old analog components, as well as the companies that gave them life, have breathed their last.

At least some of those functions survive, either as standalone parts or (sigh) as microcontroller functional blocks. And the real world, thankfully, remains stubbornly analog, which means that most of the truly interesting “digital” problems are really analog problems – grounding, crosstalk, race conditions, noise, EMC, etc.

We humans are products of the real world, too. Are we analog or digital?

The information that makes up a unique human being is mostly to be found in two places: in our genes and in our brains. The information in genes can be considered digital, coded in the four-letter alphabet of DNA. Although the human brain is often referred to as an analog computer, and is often modeled by analog integrated circuits, the reality is more nuanced.

In a fascinating discussion on this subject, computational neuroscientist Paul King states that information in the brain is represented in terms of statistical approximations and estimations rather than exact values. The brain is also non-deterministic and cannot replay instruction sequences with error-free precision; those are analog features.

Figure 1: Analog, digital, and neuron spiking signals (source: Quora)

On the other hand, the signals sent around the brain are “either-or” states that are similar to binary. A neuron fires or it does not, so in that sense, the brain is computing using binary signals.
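That all-or-nothing behavior can be sketched with a toy leaky integrate-and-fire model, a standard textbook abstraction rather than a claim about any particular neuron: the membrane potential accumulates continuously (analog), but the output is strictly fire/don't-fire (binary).

```python
def lif_spikes(inputs, threshold=1.0, leak=0.9):
    """Toy leaky integrate-and-fire neuron: the membrane potential v is a
    continuously varying (analog) quantity, but the output per time step
    is all-or-nothing, a spike (1) or no spike (0)."""
    v = 0.0
    spikes = []
    for i in inputs:
        v = leak * v + i          # analog accumulation with leak
        if v >= threshold:        # all-or-nothing firing decision
            spikes.append(1)
            v = 0.0               # reset after a spike
        else:
            spikes.append(0)
    return spikes

print(lif_spikes([0.4, 0.4, 0.4, 0.0, 1.2]))  # → [0, 0, 1, 0, 1]
```

Note how analog information (input magnitude) survives only in *when* and *how often* spikes occur, which is exactly the spiking trace shown in Figure 1.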

The precise mechanism of memory formation and retention, though, remains a mystery and may also have both analog and digital components.

Is life itself analog or digital? Freeman Dyson, the world-renowned mathematical physicist who helped found quantum electrodynamics, writes about a long-running discussion with two colleagues as to whether life could survive forever in a cold, expanding universe. Their consensus is that life cannot survive forever if life is digital, but it may survive forever if it is analog.

What of my original topic – analog vs digital computation? In a book published in 1989, Marian Pour-El and Ian Richards, two mathematicians at the University of Minnesota, proved in a mathematically precise way that analog computers are more powerful than digital computers. They give examples of numbers that are proved to be non-computable with digital computers but are computable with a simple kind of analog computer.

Consider a classical electromagnetic field obeying Maxwell's equations: Pour-El and Richards show that the field can be focused on a point in such a way that the strength of the field at that point is not computable by any digital computer, but it can be measured by a simple analog device.

The book is available for free download. Digital engineers, knock yourself out.

Meanwhile, analog rules supreme. Which, of course, we analog engineers knew all along.

Now, about that raise…..

8 comments on “Analog Is More Important Than Digital – Scientific Proof”

  1. Scott Elder
    August 26, 2015

    Paul, I've been around long enough to recall the time when many proclaimed that the days of analog engineering were numbered. It is an interesting turn of events that it's actually the job of the digital engineer that is disappearing, or at least morphing into a software position rather than a hardware one. Soon there will be only circuit engineers and software engineers, with the former dominated by analog types.

    I enjoyed reading your blog.




  2. KenKrechmer
    August 28, 2015

    Mathematically, the issue is discrete (summations) or continuous (integrals) distributions. In physics the smallest division between discrete and continuous distributions is Planck's constant which is very small. So at that level the issue is moot.  At the neuron level information appears discrete.

    In quantum mechanics a distribution may have a third form, superpositions. But the mathematical development of such distributions is weak.  Just a guess, but the brain probably stores information as superpositions.  

  3. Terry.Bollinger
    August 28, 2015

    I would suggest that the natural world is telling us powerfully that both digital and analog are needed to obtain optimal use of information about the world at large… and that at present, we've gone too heavy on digital for certain areas.

    Here's a rough heuristic for assessing the situation: digital works best, both in biology and in computers, for preserving information, while analog (currently almost solely in biology) is hugely more efficient for fast, low-energy processing of huge quantities of data about an incompletely known outside world.

    There's no better example of how important digital is to preserving information in biology than DNA, which is about as binary (or quaternary) as it gets. The way neurons fire discretely likewise preserves information during transmission, since pure analog signals would be unreadable after traveling even a short distance across mushy, chemically mediated neural circuits. Pulse coding works far better!

    Conversely, a tiny fairy gnat with nothing but microwatts of brain power to work with can hunt prey better than drones using millions of times more power. But how? Well, at least in part by making very clever use of hierarchies of simple analog heuristics with lots of real-time feedback. No one of them is very accurate, but when combined in ways that allow certainty to accumulate and focus in on emerging “points of interest,” the result is a very efficient way of processing a lot of information at an almost vanishingly small energy cost.

    We need more of that, and the source of the lessons is going to be the now-burgeoning field of neurobiology and neural cognition.

  4. zeeglen
    August 28, 2015


    You reminded me of a certain young test engineer from several years ago. I needed him to design and build a test fixture for one of my products that required an internally generated 10 Hz sine wave. After specifying in a rough block diagram what it had to do, I expected he would use a Wien bridge or phase-shift oscillator, but I overlooked the fact that he was a recent college graduate of the digital age.

    He designed the sine generator using a clock oscillator, frequency divider, address counter, EPROM containing a sine lookup table, and a D-to-A converter. (sigh)
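That lookup-table approach (direct digital synthesis, in essence) can be sketched in a few lines. The table size and DAC width here are hypothetical, chosen only to illustrate the counter-plus-EPROM-plus-DAC signal chain:

```python
import math

# Hypothetical sizes: a 256-entry table feeding an 8-bit unsigned DAC.
TABLE_SIZE = 256
SINE_TABLE = [round(127.5 + 127.5 * math.sin(2 * math.pi * n / TABLE_SIZE))
              for n in range(TABLE_SIZE)]  # the "EPROM" contents, 0..255

def sine_samples(num_samples):
    """Step an address counter through the lookup table, producing the
    sample stream the DAC would convert to an analog sine wave."""
    return [SINE_TABLE[n % TABLE_SIZE] for n in range(num_samples)]
```

The output frequency is the clock rate divided by the table length, so 10 Hz from this 256-entry table would need a 2.56 kHz address clock.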

    He also ignored my advice about heat-sinking a load transistor and using a linear power supply rather than a switch-mode supply. The result was that it took 5 minutes for the hot transistor to stabilize at operating temperature, and the technicians could not probe the analog DUT with an oscilloscope with all the switch-mode noise running around.

  5. Paul Pickering
    August 31, 2015

    Hi Scott,

    Yes, they've been writing analog off for many years. Thankfully, rumors of its death have been greatly exaggerated, as they say.

    The notes from my 1975 college analog design course are just about the only ones that are still relevant. They've outlived the instructor and will probably outlive me, too!

  6. wiliamarthur
    September 1, 2015

    I would suggest that the natural world is telling us powerfully that both digital and analog are needed to obtain optimal use of information about the world at large… and that at present, we've gone too heavy on digital for certain areas.

  7. jimfordbroadcom
    December 2, 2015

    Gee, Paul, you sure opened a can of worms!  Or jumped down a rabbit hole.  Fascinating stuff, indeed, and now I'm going to have a hard time getting anything done here at work for the rest of the day!

  8. uberestimate
    August 23, 2016

    Thanks for such detailed information.
