Will Quantum Computing Enhance Analog Design? Part 2

In Part 1, we looked at some preliminary concepts of what quantum computing is and how it compares with analog computing. Let's continue the analysis.

I contacted Dr. Ned Allen, chief scientist at Lockheed Martin, to discuss the relationship between quantum computing and analog computers. Lockheed Martin purchased the first commercial D-Wave One computer in 2012. In a Lockheed Martin video, Allen says quantum computing “might be considered the rebirth maybe of analog computing.” In an email, I asked him to elaborate. In his reply, he described the difference between quantum computing and digital computing with a reference to “analogue reckoning.”

Rather than holding discrete values of 0 or 1 the way transistors do, qubits in superposition (a quantum mechanical term meaning multiple states being simultaneously possible) can simultaneously have a value of 0, 1, and all values in between. Thus, qubits are more analog in nature than transistors.
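The superposition bookkeeping can be sketched in ordinary code (this simulates the math on a classical machine and captures none of the quantum speedup; the function names are mine, not from any quantum library):

```python
import math
import random

# A qubit's state is a pair of amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 == 1 -- a continuum of possible states,
# which is what makes the "more analog than a transistor" comparison apt.
def make_qubit(theta):
    """Qubit on the real slice of the Bloch sphere: cos(t/2)|0> + sin(t/2)|1>."""
    return (math.cos(theta / 2), math.sin(theta / 2))

def prob_one(qubit):
    """Probability that a measurement yields 1 (the Born rule)."""
    _, beta = qubit
    return abs(beta) ** 2

def measure(qubit, rng=random):
    """Measurement collapses the superposition to a classical 0 or 1."""
    return 1 if rng.random() < prob_one(qubit) else 0

# An equal superposition: 0 and 1 are equally likely until measured.
q = make_qubit(math.pi / 2)
print(round(prob_one(q), 3))  # 0.5
```

The key point the code makes is that the continuous parameter `theta` selects among infinitely many valid states, yet every measurement still comes out as a discrete 0 or 1.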

I should note that the École Polytechnique Fédérale de Lausanne (EPFL) says it is working to develop transistors using quantum tunneling, along with IBM Zurich and CEA-Leti. The EPFL said in a press release that these devices will be commercialized by 2017 and will cut transistor power consumption to one-hundredth of present levels. Unfortunately, such chips still won't be any better at solving the problems for which quantum computers show talent.

Very few head-to-head performance comparisons exist between quantum and digital computers. This is partly because there are very few quantum computers of significant size, but it is also a challenge to devise problems where solutions can be properly compared. In a paper titled “Experimental Evaluation of an Adiabatic Quantum System for Combinatorial Optimization,” Catherine McGeoch of Amherst College and Cong Wang of Simon Fraser University reported a number of tests between a D-Wave Two quantum computer and a digital multi-processor computer with seven Intel quad-core processors. They reported that the quantum computer performed about 3,600 times better, using 439 working qubits. Colin Williams, director of business development and strategic partnerships at D-Wave, told the New York Times about the results: “For most problems, [the quantum computer] was 11,000 times faster, but in the more difficult 50 percent, it was 33,000 times faster.”

This kind of performance encouraged Google to initiate what it is calling the Quantum Artificial Intelligence Lab. The company will foot the bill for a D-Wave Two computer to be housed at the NASA Ames Research facility and managed by the Universities Space Research Association. Google said in a blog post on this topic that the goal is to “study how quantum computing might advance machine learning”. The company also said it has a general interest in the “highly difficult” field of machine learning.

It is interesting to consider what might be done with future quantum computers if they become more widespread. Synaptic Laboratories has been considering the impact of quantum computing on data security for several years. The Maltese company says on its website that many cryptographic techniques employed today, particularly the public key infrastructure (PKI), will be quite easy for powerful quantum computers to break.

Benjamin Gittins, CTO of Synaptic, told me in an email, “One thing we do know is that fundamental flaws (see this paper) already exist in the global PKI model.” Nevertheless, “it is still unclear how many qubits might be needed in a D-Wave computer to break public key crypto in practice.”
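Gittins's point about PKI can be made concrete with a toy example. RSA's private key falls out immediately once you can factor the public modulus, and efficient factoring is exactly what Shor's algorithm promises on a sufficiently large gate-model quantum computer (D-Wave's annealer does not run Shor's algorithm as such, which is part of why the qubit count needed "in practice" is unclear). A minimal sketch with deliberately tiny primes; real keys use primes hundreds of digits long:

```python
# Toy RSA: knowing the factors of n is all it takes to derive the
# private key. The primes here are illustrative only.
p, q = 61, 53                 # secret primes
n = p * q                     # public modulus (3233)
e = 17                        # public exponent
phi = (p - 1) * (q - 1)       # Euler's totient -- computable only from the factors
d = pow(e, -1, phi)           # private exponent (Python 3.8+ modular inverse)

msg = 42
cipher = pow(msg, e, n)       # anyone can encrypt with the public pair (n, e)
plain = pow(cipher, d, n)     # only the holder of the factors can decrypt
print(plain)  # 42
```

Classically, factoring a 2048-bit `n` is believed intractable; a quantum factoring machine would turn the last two lines into something any eavesdropper could do.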

Perhaps of more relevance to this forum are the possibilities to automate validation processes, such as validating an analog IC design or simulating a mixed-signal IC and validating firmware for it. There are very good design tools available now, but a quantum computer might be better at testing a large range of input conditions, and it might handle the analog parts more naturally. This is speculation, but I can imagine a validation system using a quantum computer that would avoid number-of-bits precision limitations during a simulation of an analog IC. Perhaps the ultimate in analog design will be enabled by advances in quantum computing.


10 comments on “Will Quantum Computing Enhance Analog Design? Part 2”

  1. Scott Elder
    June 4, 2013

    One aspect of quantum computing that mimics analog is that claims are not easily verified. Who is to say that analog circuit A is better than circuit B?  We can't even get audiophiles on the same page when it comes to whether gold plated conductors produce better sound.  I think we'll know the outcome when many more customers buy quantum computers.

    If Google buys one, Lockheed buys one, that's okay, but that's not validation.  Those companies view these purchases as staying in the game, staying in the know, which is important.  But it's not validation.




  2. eafpres
    June 4, 2013

    I would think of Google etc. as a leading indicator. You are right about validation. There is a lot of controversy and heated debate over whether quantum computing is real. In addition, there are those who say D-Wave's computer isn't actually a quantum computer. I think in time the validation will get worked out for quantum computers, unless the whole thing goes away. Do you have any favorite “which is better” stories for analog electronics? I like “tubes vs SS”.

  3. Scott Elder
    June 4, 2013

    Hi Blaine- I'm not hung up on what D-Wave wants to call their computer or how it actually works.  I guess those things bother physicists because of the implications behind the terms.  But, the best argument I've seen made against D-Wave's claims is that their success metrics are based upon how fast they solve problems that are optimized for their hardware when compared against another piece of hardware (Intel processors) that is designed to solve ALL types of problems.

    D-Wave has been at this for nearly 15 years.  If I asked a digital engineer to go design a digital circuit to compute the D-Wave specific problem, I seriously doubt the comparison ratio would have been 30,000x, and it surely wouldn't have cost $100 million to design.  A simple AND gate can input, compute, and propagate a result in picoseconds.  I think the clock cycle of an Intel CPU is a few hundred picoseconds.  So does that mean that I should claim a 1000x speedup in AND-function computing?

    I used to use an IC layout tool that was lightning fast on redraw.  One could zoom in and out and pan all around faster than you could see, even with 100 million transistors in the design.  Turns out the team that wrote the software coded the compute-intensive algorithms in assembly code!

    But let's not take the wind out of their sails.  One thing is certain.  Everyone is learning more.  When the insiders give up, then we'll know the truth.

  4. Scott Elder
    June 4, 2013

    “Which is better” discussions: I only like the ones where the definition of success is agreed upon upfront.  Otherwise it is like discussing politics and religion.

  5. eafpres
    June 4, 2013

    Hi Scott–you certainly have company on questioning the metrics.  If you would like to read some (scathing) counter arguments and (even more scathing) replies take a look at this blog:

    Shetl-Optimized Blog

    One way to look at this, if you stay out of the existence and other physics arguments, is that 15 years and $100M is chump change compared to the money that has gone into nuclear fusion.  Yet I'm sure it can be done, and one day it may be the dominant energy source for the planet.

    Closer to home in the physics community is string theory.  A few decades and untold billions later, some physicists have started to cry foul.  Some are even questioning whether math is the real answer (!).

    Quantum computing could fall anywhere within the space of the above two examples, or turn out much better.

  6. eafpres
    June 4, 2013

    Scott–agree that metrics/scoring needs to be well defined up front.  Do you have any analog electronics examples though?  Seems like a fun topic.

    To your previous point, I see the tube vs. SS arguments often using adjectives that I have to look up in the dictionary.  In fact, the entire audio media community seems infected with some kind of hypervocabularism.

  7. Scott Elder
    June 4, 2013

    Which is Better:

    Here's one–data converters interfacing with sensors.  SAR vs. Delta-Sigma.

    SAR is low power, but needs a preamp to get the signal into the dynamic range of the SAR.  And it also needs a more aggressive anti-alias filter.

    Delta-sigma is higher power, but needs less of a preamp (maybe none) and a much simpler anti-alias filter.

    Seems like lots of issues to contend with, like CMRR with power-line noise and cost, in addition to the issues listed above.

    I think the world of data converters has gotten so complicated that I wonder how many applications really are using the appropriate solution (lowest cost at lowest power with adequate performance).  This is in line with Bill S.'s comments of a while back, where he questioned whether an engineer does a sufficiently thorough job assessing the real requirements of each component.  How much money is overspent just to swamp the problem with precision?

    ….and then the topic of Neural Networks was brought up again last week as an alternative to deterministic signal processing.  But my mind is still quite closed on that topic for all of the reasons I listed in that discussion.
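Scott's anti-alias tradeoff can be put in rough numbers. A back-of-envelope sketch, with figures of my own choosing (a 1 kHz signal band, 60 dB of alias rejection, and a crude one-pole-per-20-dB/decade cascade approximation rather than any real filter design):

```python
import math

def antialias_poles(f_band_hz, f_sample_hz, stop_atten_db=60.0):
    """Rough pole-count estimate for an anti-alias filter: each pole
    buys ~20 dB/decade of rolloff above the band edge. The first alias
    folds down from (f_sample - f_band), so that's where we need the
    full stopband attenuation."""
    f_alias = f_sample_hz - f_band_hz
    db_per_pole = 20.0 * math.log10(f_alias / f_band_hz)
    return math.ceil(stop_atten_db / db_per_pole)

F_BAND = 1_000  # 1 kHz signal band (illustrative)

# SAR sampling just above Nyquist: a brutal transition band.
sar_poles = antialias_poles(F_BAND, 2_500)
# Delta-sigma oversampling at 128x: the alias band sits 8 octaves away.
ds_poles = antialias_poles(F_BAND, 128 * 2 * F_BAND)

print(sar_poles, ds_poles)  # 18 2
```

Under these assumptions the SAR front end needs on the order of an 18-pole filter where the oversampled delta-sigma gets by with two, which is the "much simpler anti-alias filter" in concrete terms.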

  8. amrutah
    June 8, 2013

    Thanks for the post and the “critic” link you posted in the comments.  It helps in understanding this topic of “Quantum Computers,” though at times it still leaves me confused about the technology.

    I have to say that this technology is in its nascent stage; the algorithms used to solve the problems that determine its speed might vary, but the hardware is not a fake.  It is still based on the laws of physics and quantum theory.


  9. amrutah
    June 8, 2013

    “can simultaneously have a value of 0, 1, and all values in between”

    I have read that the latest version of the quantum computers has 512-level qubits, and there are plans to increase it to 2048 levels, which will lead to higher computational speeds.


  10. eafpres
    June 8, 2013

    Hi amrutah–Actually, the D-Wave Two has 512 qubits in the “processor,” and they have plans to upgrade to 2048 qubits.  This is different from the number of levels per bit.  However, more qubits are important for tackling larger problems.
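To put that correction in perspective: in the textbook gate-model picture (D-Wave's annealer uses its qubits differently), the qubit count sets the size of the state space, and it grows exponentially. A two-line illustration:

```python
# A register of n qubits is described by 2**n complex amplitudes,
# not n "levels" -- that exponential state space is where the
# hoped-for computational power comes from.
def amplitudes(n_qubits):
    return 2 ** n_qubits

print(amplitudes(10))             # 1024
print(len(str(amplitudes(512))))  # 2**512 has 155 decimal digits
```

So the jump from 512 to 2048 qubits is not a 4x change in anything a classical simulator could keep up with; the state description itself grows by a factor of 2**1536.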
