Operations per Joule

It has been estimated that the human brain performs 3.6×10¹⁵ synaptic operations per second and, judging from blood flow and oxygen consumption, consumes about 12 W. That works out to 3×10¹⁴ operations per Joule, yet the brain is made up of slow and noisy components.

Digital computers have come a long way. UNIVAC managed 0.015 operations per Joule in 1951; IBM's Blue Gene managed 1.684×10⁹ operations per Joule in 2010. It has been argued that the human brain is so much more efficient because it uses the physics natural to it, rather than something foreign. This may also imply that the conversion into the digital domain, and the computation in that domain, is inefficient, and I think few people would disagree with that. It is generally accepted that a processor is the least efficient way to get any particular job done; it just has flexibility on its side.
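
As a rough sanity check, here is a minimal sketch of the operations-per-Joule arithmetic, using only the figures quoted above (the 12 W and 3.6×10¹⁵ ops/s are the cited estimates, not measurements):

    # Back-of-the-envelope operations-per-Joule comparison,
    # using only the figures quoted in the text above.
    brain_ops_per_s = 3.6e15       # estimated synaptic operations per second
    brain_power_w = 12.0           # estimated power from blood flow / oxygen uptake

    univac_ops_per_j = 0.015       # UNIVAC, 1951
    blue_gene_ops_per_j = 1.684e9  # IBM Blue Gene, 2010

    brain_ops_per_j = brain_ops_per_s / brain_power_w   # ~3e14 ops/J

    print(f"Brain:     {brain_ops_per_j:.2e} ops/J")
    print(f"Blue Gene: {blue_gene_ops_per_j:.2e} ops/J")
    print(f"Ratio (brain / Blue Gene): {brain_ops_per_j / blue_gene_ops_per_j:.0f}x")

On those figures the brain comes out roughly five orders of magnitude ahead of Blue Gene.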

Therefore, it would seem that analog should be capable of higher levels of computation per unit of energy, because it too can utilize the physical attributes of its environment. But are we making use of this? It would appear that progress is being made, but I wonder if we aren't on the wrong track.

In a 2004 presentation on digitally assisted analog circuits, Bernhard Boser of the University of California, Berkeley, offered two graphs. The first showed improvements in digital and analog circuit performance over a 15-year period. It is not entirely clear to me what type of function this is based on, so the figures may be questionable. However, his numbers show a 150x difference in improvement between digital and analog.

He does note that, over that period, analog improved more than digital in terms of power consumption, although analog trailed digital in logic improvement. This again shows how processor architectures got less efficient during this period.

In 1998, Rahul Sarpeshkar, of the department of biological computation at Bell Labs, wrote in a paper that analog computation could be far more efficient than digital computation. Adding two eight-bit numbers would require about 240 transistors in a CMOS digital process, but it takes just a single wire and the application of Kirchhoff's current law in an analog structure. Similarly, an eight-bit multiplication would consume 3,000 transistors in a digital setup, while a similar multiplication of two currents in an analog setup takes only four transistors. The problem here, as I see it, is that he is comparing things that require absolute precision with things that cannot provide it, even though they would probably produce just as good an answer for most purposes.
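
To make that precision trade-off concrete, here is a small, purely hypothetical simulation (not from Sarpeshkar's paper) of "adding" two eight-bit values as currents summed on a shared node, with a 1% Gaussian error term standing in for noise, mismatch, and readout imprecision, versus an exact digital add:

    import random

    def digital_add(a, b):
        # Exact 8-bit addition (the ~240-transistor CMOS version).
        return a + b

    def analog_add(a, b, noise_frac=0.01):
        # Toy model of Kirchhoff's current law: two currents proportional
        # to a and b flow into one node; the sum is read back with ~1%
        # error standing in for noise, mismatch, and readout imprecision.
        true_sum = a + b
        return true_sum * (1.0 + random.gauss(0.0, noise_frac))

    random.seed(0)
    errors = []
    for _ in range(10_000):
        a, b = random.randrange(256), random.randrange(256)
        errors.append(abs(analog_add(a, b) - digital_add(a, b)))

    print(f"mean |error| of the noisy 'analog' add: {sum(errors)/len(errors):.2f} LSB")

For many signal-processing purposes an error of a couple of LSBs is perfectly acceptable, which is exactly the trade being described.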

Have we demanded such precision and repeatability that we have backed ourselves into a corner? The human brain is probably no more accurate than a typical analog circuit, and we hardly understand the noise sources to which it is subjected, yet we presume that the human mind is far more capable than any machine — even though we sometimes make mistakes. We are so much more powerful than computers that we can add redundancy into the process to try to catch the large errors, and most of the time we ignore the small ones. Ironically, IBM's Watson, which was designed to win on Jeopardy!, made a number of errors but still managed to consistently beat its opponents. However, the power it took to do that was immense.

Are we expecting too much precision from analog? If we relaxed that, could we do a lot more for less power?

45 comments on “Operations per Joule”

  1. Dirceu
    May 20, 2013

    This comparison between the analog and digital worlds, from the perspective of energy efficiency, is very interesting. It makes me wonder what resolution (in bits) an ADC would need in order to capture the nuances required by the various organs of the human body. The text also reminds me of the words from “Rise of the Robots – The Future of Artificial Intelligence” by Hans Moravec, published in SA:

        From long experience working on robot vision systems, I know that similar edge or motion detection, if performed by efficient software, requires the execution of at least 100 computer instructions. Therefore, to accomplish the retina's 10 million detections per second would necessitate at least 1,000 MIPS.

        The entire human brain is about 75,000 times heavier than the 0.02 gram of processing  circuitry in the retina, which implies that it would take, in round numbers, 100 million MIPS (100 trillion instructions per second) to emulate the 1,500-gram human brain. Personal computers in 2008 are just about a match for the 0.1-gram brain of a guppy, but a typical PC would have to be at least 10,000 times more powerful to perform like a human brain.
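
    A quick back-of-the-envelope check of the arithmetic in Moravec's estimate, using only the figures quoted above (the masses and the 100-instruction figure are his assumptions):

        retina_detections_per_s = 10e6    # 10 million edge/motion detections per second
        instructions_per_detection = 100  # Moravec's software estimate

        retina_mips = retina_detections_per_s * instructions_per_detection / 1e6
        print(f"Retina equivalent: {retina_mips:.0f} MIPS")   # 1,000 MIPS

        brain_to_retina_mass_ratio = 1500 / 0.02              # ~75,000x by weight
        brain_mips = retina_mips * brain_to_retina_mass_ratio
        print(f"Brain equivalent:  {brain_mips:.1e} MIPS")    # ~7.5e7, i.e. ~100 million MIPS in round numbers
        print(f"                   {brain_mips * 1e6:.1e} instructions per second")  # ~1e14, i.e. ~100 trillion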

  2. bjcoppa
    May 20, 2013

    Sandia National Lab is one of the leaders in developing quantum computers. The theory is still not perfectly understood, which has held back progress over the last couple of decades. I visited a Sandia quantum computing lab in the late 1990s, and the field has not advanced as fast as expected since then. Of course, federal funding cuts have not helped.

  3. Davidled
    May 21, 2013

    HP Labs is doing a lot of research on computer architecture, networking, mobility, and systems. I got a chance to attend an HP Labs seminar. Their presentation described quantum information. HP has a quantum science research lab.

  4. DEREK.KOONCE
    May 21, 2013

    Analog, in and of itself, has infinite resolution. Yet I see it always ends up being digitized in some form to explain it to people. And from a marketing perspective, touting 12 bits over 10 bits makes people want more. The general public seems not to understand the analog world and accepts some inaccuracies. However, the research science realm thrives on precision; imagine working from yocto- to beyond yotta- in time, distance, or mass units.

  5. Brad Albing
    May 21, 2013

    Analoging – quantum computing sounds like a good topic for a blog, altho' I'm not sure if it would fit in on Planet Analog – but let us know if you think it's worth pursuing.

  6. eafpres
    May 22, 2013

    Here are a couple interesting updates on quantum computing.  The boson sampling device is mainly analog conceptually.

    Boson sampling quantum computer

    Google funds quantum computing

     

  7. rfindley
    May 22, 2013

    In the latest issue (June 2013) of Discover magazine, there is an article “Mind in the Machine” that talks about a 1000x improvement in brain simulation capability when the scientists realized that only a small fraction of neurons actually fire at a given time, and thus only 1 in 1000 needed to be simulated per iteration.

    The article says their criteria for selection was “neurons that had recently fired, and were thus most likely to fire again.”  Of course, there must also be some criteria for re-selecting neurons that haven't fired recently, otherwise you end up with a dwindling list.  I suppose a threshold of inputs would trigger the re-awakening of a neuron.  I use a somewhat similar approach in my own work.

    [BTW, thanks to Max for sending me that issue of Discover!]
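
    A minimal sketch of that active-list idea: only neurons that have fired recently, or that have just received input, get stepped on each iteration. The neuron model, thresholds, and fan-out here are invented purely for illustration, not taken from the article:

        import random

        random.seed(1)
        N = 50_000          # total neurons (toy scale)
        THRESHOLD = 1.0     # firing threshold, arbitrary units
        DECAY = 0.9         # per-step leak on the membrane potential
        FLOOR = 0.05        # below this, a quiet neuron drops off the active list

        potential = [0.0] * N
        fanout = {i: random.sample(range(N), 20) for i in range(N)}
        active = set(random.sample(range(N), 500))      # seed activity

        for step in range(10):
            fired = []
            next_active = set()
            for i in active:
                potential[i] = potential[i] * DECAY + random.uniform(0.0, 0.3)
                if potential[i] >= THRESHOLD:
                    fired.append(i)
                    potential[i] = 0.0                  # reset after firing
                if potential[i] > FLOOR:
                    next_active.add(i)                  # still worth simulating
            for i in fired:
                for j in fanout[i]:
                    potential[j] += 0.4                 # synaptic input re-awakens targets
                    next_active.add(j)
            active = next_active
            print(f"step {step}: {len(fired)} fired, active list = {len(active)}")

    Only the active set is touched each step, so the work per iteration scales with the fraction of neurons that are actually doing something, which is the source of the speed-up described.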

  8. ChrisCaudle
    May 22, 2013

    “Analog, in and of itself, has infinite resolution”

    Not in the real world.  Resolution comes from the word resolve, and the ability to resolve differences between analog quantities is limited by intrinsic thermal noise.  The Shannon limit still applies.
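
    To put a number on that: the Johnson-Nyquist noise of even a modest source resistance already bounds how finely an analog voltage can be resolved. A small sketch using the standard formulas (the 1 kΩ resistance, 20 kHz bandwidth, and 2 V full scale are just example values):

        import math

        k_B = 1.380649e-23   # Boltzmann constant, J/K
        T = 300.0            # temperature, K
        R = 1e3              # source resistance, ohms (example value)
        B = 20e3             # bandwidth, Hz (example value)
        v_fullscale = 2.0    # full-scale signal, volts RMS (example value)

        v_noise = math.sqrt(4 * k_B * T * R * B)           # thermal noise voltage, RMS
        snr_db = 20 * math.log10(v_fullscale / v_noise)    # best-case dynamic range
        bits = 0.5 * math.log2(1 + (v_fullscale / v_noise) ** 2)  # Shannon limit, bits/sample

        print(f"thermal noise: {v_noise * 1e9:.0f} nV RMS")
        print(f"max SNR:       {snr_db:.1f} dB")
        print(f"Shannon limit: {bits:.1f} bits per sample")

    Even under those generous assumptions the channel tops out at roughly 21 to 22 bits, so the resolution is large but decidedly finite.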

  9. Brad Albing
    May 23, 2013

    It's sometimes a little risky using an article in Discover magazine for source material, but 'tis interesting. Could be a good blog in there that would speak to a sort of analog/digital blend for a computer.

  10. Brad Albing
    May 23, 2013

    I eagerly await two blogs from you on these topics!

  11. Brad Albing
    May 23, 2013

    OK, so maybe we should say, “For all intents and purposes, analog has infinite resolution.”

  12. ChrisCaudle
    May 23, 2013

    so maybe we should say, “For all intents and purposes, analog has infinite resolution.”

     

    No way; that statement is not in any way accurate unless you are working on a system with extremely loose specs. Look at any high-quality converter system. Until you get to very high bandwidths, the digital section of the system has much more resolution than the analog section can support. A-to-D and D-to-A converters are limited to around 120 dB signal-to-noise ratio, even though you can easily have 144 dB of resolution on the digital side of the system (24 bits). Whether you are trying to measure DC values or AC signals up to a few hundred kHz, the values you measure are going to be inaccurate due to DC offsets and thermal noise. These are all analog effects, fundamental to the physics of the devices, not limits imposed by the digital architecture.
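
    The 120 dB versus 144 dB comparison follows from the usual rule of thumb for an ideal N-bit quantizer, SNR ≈ 6.02·N + 1.76 dB. A quick sketch of the conversion in both directions, using the figures quoted above:

        def ideal_snr_db(bits):
            # SNR of an ideal quantizer driven by a full-scale sine wave.
            return 6.02 * bits + 1.76

        def effective_bits(snr_db):
            # Invert the rule of thumb to get the effective number of bits (ENOB).
            return (snr_db - 1.76) / 6.02

        print(f"24 bits -> {ideal_snr_db(24):.1f} dB ideal SNR")       # ~146 dB
        print(f"120 dB  -> {effective_bits(120):.1f} effective bits")  # ~19.6 bits

    So a “24-bit” converter whose analog front end delivers 120 dB is really behaving like a 20-bit one.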

  13. WKetel
    May 23, 2013

    Analog systems do have nearly infinite resolution, or at least much better than we need. The problems and limitations are where the signal goes into and out of the digital realm. The quantizing error of the converters is where the problem lies, and the constant question is how many bits of resolution and accuracy are needed for a specific task. It winds up being a compromise, as most things are, and the result is that somebody has to actually think the entire process through.

  14. WKetel
    May 23, 2013

    The explanation that I have for how the brain can do so much with so little power has much more to do with the software than with the processor speed. The brain's “machine code” is so very much more efficient that it is an entirely different concept in function, not just a little bit different. The brain's operating system is so very different that those folks at Microsoft should not even waste their time attempting to think about it.

  15. Brad Albing
    May 24, 2013

    >>The brain's “machine code” is so very much more efficient that it is an entirely different concept in function…

    Can you expand upon this any? It may go beyond an analog discussion, but it would be interesting to know.

  16. BrianBailey
    May 24, 2013

    @Brad – I can see what you are saying in the theoretical space. If we assume for a moment that noise doesn't exist, then analog is not bound by the quantization problem. Digital has finite resolution, but does not suffer from noise. However, the reality is that noise does exist, and in many cases the specs for the A2Ds are set by the amount of noise in the environment they have to operate in. There is no point in trying to get more resolution out of a signal than is really there.

  17. Brad Albing
    May 24, 2013

    OK – that explanation makes good sense. Thanks.

  18. WKetel
    May 26, 2013

    Brad, if I understood enough about the brain's operations to explain adequately how it works, I would be doing something other than engineering, to the great benefit of all. But what we can see is that, for starters, the brain learns; computers don't learn, they only sort data. Also, computers have data, while humans have both knowledge and insight, which are distilled from data. Besides all of that, which is quite a bit, human brains are able to process in a variable and non-linear manner, while current computers chug through a set sequence of things. A slight bit like fuzzy logic, only without all of the lies and dumb examples, and actually able to evaluate information instead of just evaluating data.

    (I have posted quite a few times about the relationship between data, information, knowledge, and insight.)

    The other difference is that the human mind is mostly able to gain knowledge and insight without somebody needing to write the code to tell it how.

  19. Netcrawl
    May 26, 2013

    @WKetel I agree with you; there's a big difference between a human brain and a computer in the learning process. Computers don't learn; they're programmed to work on specific tasks (the GIGO thing: garbage in, garbage out). If you program one to execute a certain task, it is going to follow that even if it's wrong. The computer's huge advantage is its processing ability, the power to sort huge volumes of data, but it possesses only data, while the human brain has the “total package” needed to execute a certain task effectively. Computers are human-dependent; they need human interaction to execute anything.

  20. Davidled
    May 26, 2013

    The fuzzy brain has a lot of cells storing information. The nervous system might be modeled in both the analog and digital domains. Our technology is limited in its ability to simulate a complete human brain. A chip might hold more transistors at higher density; for example, some IC vendors are changing the transistor structure into a 3D form to fit more inside the chip, with faster processing. Therefore, algorithms related to human thinking might potentially be implemented in an advanced chip.

  21. SunitaT
    May 27, 2013

    @Brian, thanks for the post. No doubt we have seen a drastic shrink in the technology over the years. While this technology shrink benefits both analog and digital circuits alike, overall analog circuit performance is compromised by other trends such as reduced supply voltages.

  22. BrianBailey
    May 27, 2013

    I totally agree. The process reductions were, for a while, all in digital's favor, but that is changing these days, and further reductions, while necessary for cost reduction, are making digital design a lot more complicated as well, breaking down many of the abstractions that have given digital its productivity gains.

  23. amrutah
    May 27, 2013

    “the application of Kirchhoff's current law in an analog structure…”

      As process technologies shrink, the leakage currents become relatively larger, and hence applying KCL to perform operations might pose tricky problems. But if GaAs wafers and process technology improve (GaAs is supposed to have low leakage), then KCL-style operations would help.

  24. WKetel
    May 27, 2013

    Brian, it seems that you are quite right. The step to 28nm technology has about doubled the needed effort, and the step to 14nm will double the effort again, and may never deliver adequate production yields. So once again I pose the question as to “wouldn't it be easier just to write more efficient code?” 

    Besides all of that, the cost is small enough currently that there is not any really good reason to reduce the price of processors any more. 

  25. WKetel
    May 27, 2013

    Netcrawl, you have already stated the balance of the reason: computers can only sort data, so the resulting product is sorted data, not knowledge, insight, or understanding. That is the balance of the explanation of the very fundamental difference between how humans think and how computers process.

    Now, in spite of this very major difference, there are still a few folks who want to create actual intelligent thinking computers, or even robots. They don't understand that such a system would embody all of the emotional issues of the programmers, which would guarantee that the package would not be able to co-exist with normal humans. As long as such a system remained in isolation that might not be a problem, but if it became linked to the real world it could produce a disaster.

  26. BrianBailey
    May 27, 2013

    It is not just the price of the processor that we are trying to minimize anymore: it is the processor plus the memory plus the peripherals plus the graphics engines and wireless engines, etc. It is by putting all of these onto the same chip that you get the cost reduction, and that is also one of the reasons for the increasing pressure to bring analog onto the chip. It is actually amazing how little of the digital circuitry is designed from scratch for each chip – it is almost all becoming back-end effort plus front-end architecture.

  27. BrianBailey
    May 27, 2013

    Oh, and another thought – the day of the RTL digital engineer has passed. There will be fewer and fewer jobs for them.

  28. WKetel
    May 27, 2013

    Brian, I see, the reason for putting it ALL on one chip is so that those dozens of useless features can be added cheaper. Now it becomes clear that the big thing is about sticking on all of those features. What are the development costs for one of those large devices? Without giving away any company secrets, of course. My guess is that the cost of developing one of those custom chips that actually works as intended is quite high. And how many product units are produced? 10,000, or is it more like 250,000? I really wonder.

    While standard parts would have a greater parts cost, the development cost for a standard chip is zero, and the production yield is generally high enough to give you as many as you order. And sometimes they are available from multiple sources. Of course, in my industry the production runs are much smaller, so developing a custom chip would be totally unwise as far as costs go. But I always did wonder at what point the custom device becomes the better choice. And do you always reach that number of production units before the product becomes obsolete?

  29. Brad Albing
    May 28, 2013

    OK – thanks for expanding on that a bit. And yes, if we did understand more, we could be neuroscientists instead of engineers.

  30. Brad Albing
    May 28, 2013

    @Brian, to your point that the process reductions were… all in digital's favor but that is changing these days and further reductions… are making digital design a lot more complicated. This ties back to other discussions we've had on integrated analog devices – the difficulty of using processes that are better suited for digital to do analog functionality.

  31. BrianBailey
    May 28, 2013

    You are totally correct. You would not do a custom chip for 10,000 units unless of course you could sell them for a huge amount of money. Every chip will have a different cost depending on many factors but you can expect it to be in the 10s of millions of dollars.

  32. BrianBailey
    May 28, 2013

    My next post will talk about a promising way to mix processes. You must have read my mind – question is what technology did you use for that…

  33. Brad Albing
    May 28, 2013

    BB – even tho' this moves beyond our charter, I'm curious why you think the day of the RTL digital engineer has passed. Just wondering….

  34. Brad Albing
    May 28, 2013

    Well, as an editor, I am able to see into the future and read the minds of bloggers. The actual method by which I do this is of course classified. But here's a clue that may allow some insight: “These are not the 'droids you're looking for.”

  35. Scott Elder
    May 28, 2013

    @Brian, Re: RTL is dead….. If your claim is that writing SystemVerilog or SystemC is sufficiently silicon efficient and power efficient (when compared to RTL), then you may be right. But I'm not sure that is the case.

    Isn't it similar to FPGAs?  FPGAs are design/production time efficient, but not silicon or power efficient.  As a consequence, the FPGA market is still only a fraction of the total digital IC market.

    Seems to me that at one end you have FPGAs (time to market), at the other end RTL (cheapest silicon and most power efficient), and lurking in the middle are SystemVerilog and SystemC.

     

  36. BrianBailey
    May 28, 2013

    So little of the logic in a digital chip is designed from scratch. 90% of the chip area is consumed by IP blocks that are from third parties or reused from previous designs. They have been optimized and tested at the RTL level. They may still need some back-end optimizations, but the RTL guy doesn't do this. Similarly, the RTL guy doesn't assemble all of the blocks together at the system level, or make the decisions at that level. So the traditional RTL engineer is left working on 10% of the chip. While 10% of a huge number is still large, it is not growing in terms of complexity, and other techniques are emerging that will enable some of that to come from higher levels of abstraction, where even greater power optimizations and savings can be made.

  37. WKetel
    May 28, 2013

    There is the big difference, which is that in the industrial equipment market a run of fifty units is huge; most production runs are five to ten systems. So I have a different point of view as a result.

  38. BrianBailey
    May 29, 2013

    Right. The “cutting edge” is different for different markets and applications. While in the consumer space cost reduction is the number one issue and that requires massive integration, in other areas it may be flexibility or ultra-low power.

  39. WKetel
    May 29, 2013

    Brian, in the areas where I do most of my design work the number one issue is reliability, including stability of adjustment and accuracy. After that come serviceability and longevity. So those goal areas are just about the opposite of the typical consumer product's. Some product users find that downtime and failure are so much more costly that product cost moves quite a way down the list. All of their other requirements get summed up as “quality”, if a one-word summary could ever be adequate.

    But to the typical purveyor of consumer products, quality is equated to the number of features. So there is a serious language barrier here.

  40. BrianBailey
    May 29, 2013

    And let's not even start to talk about the language barriers between the hardware and software folks. It wasn't that many years ago that the two groups didn't even know where the other group was located, let alone know anyone in it.

  41. DEREK.KOONCE
    May 29, 2013

    Oh, and the finger pointing that went on as the hardware folks say it is a software problem. And the software folks say it is a hardware problem. I have a good friend that knew both sides and when he stepped in to point, he was right every time.

  42. WKetel
    May 29, 2013

    During the period when we were having our control software written by an outside contract house, we had a very good arrangement for finding the actual source of problems that could have been either hardware or code. When a problem was discovered, we would write a detailed description that listed everything we knew about it, and pass that to the software folks. Our agreement was that whoever found the cause would immediately contact the other side and let them know what it was. This worked very well and allowed diagnostics to proceed rapidly. Then we started doing the programming in-house, and used the same approach, except that communication became even faster and more convenient.

    How else would a sensible organization handle the challenge?

  43. Brad Albing
    May 29, 2013

    @Brian – exactly so. When I worked for a company that made igniters for gas-fired appliances, our criteria were substantially different than when I worked for a company that made CT scanners.

  44. Brad Albing
    May 29, 2013

    I know the GaAs devices can operate at higher applied voltages (and at higher frequencies), but do they have lower leakage specs at supply voltages of 0.9V to 3.3V?

  45. StephenGiderson
    September 17, 2018

    I did not know that is really just how much work our body is running on but nevertheless, nature never fails to amaze me. It is still as remarkable as I have read about on several other findings kindly shared by enthusiasts like yourself as well. It gets even more interesting as the day goes by when someone comes across a fact or two that are too good to be missed. Reading is definitely key in obtaining information that you would have otherwise simply left out.
