
I'm sure nearly everyone has heard by now that Google "proved" the D-Wave 2 they operate jointly with NASA (mainly paid for by Google) can run "up to ~10^8 times faster". If not, you might want to have a quick read of their paper, titled "What is the Computational Value of Finite Range Tunneling?" Even if you don't read the paper, you can be excused for wondering whether "the race to a real quantum computer" (my words) is over. There is enough press around every such announcement from Google, IBM, and others that you can easily get the impression it is just down to the details now.

In fact, there are still major, fundamental challenges to be overcome and building blocks to be created. You might have heard early last year that IBM had solved the problem of quantum error correction. Although there was a lot of buzz about that, it is interesting to note that a month earlier, Google had reported on their own blog that they had already done more or less the same thing. And so it goes. But the near-term reality could be even more interesting, especially if you are old enough, or lucky enough, to have played with analog computing.

Way back in 2013, I wrote a series of posts on the possible role of quantum computing in analog design (see Will Quantum Computing Enhance Analog Design?, Parts 1-3). In the introduction to that series, I provided the following diagram:

Figure 1: An analog problem.

Consider being given the problem of making a table, sort of like a train schedule, representing the flow out of the five outlets for various combinations of the inlets. Solving that problem in digital space requires a pretty good fluid dynamics simulation tool, an accurate physical model of the system translated into the tool's input format, and a fast computer. It is very doable, but pretty expensive.

Now suppose we change the problem to determining which input has the most impact on the flow out of the third pipe from the left. You could solve that by going to the hardware store and plumbing up a simulator, which would be a version of an analog computer. Or you could build an electrical analog computer, come up with models for the fittings (combinations of L, R, and C circuits), and then play with it. The result for any given set of inputs, once the analog computer is in hand and "programmed", is available nearly instantly, while each run of the digital simulation might take minutes, hours, or days. If you think about the fittings modeled as L-R-C circuits, they capture the dynamics as well as the steady-state behavior. The dynamic part can be thought of as a simulator node capable of having many possible values. And that sounds something like a qubit.
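To make the "programmed analog computer" idea concrete, here is a minimal sketch of what the digital equivalent has to do. It assumes a linearized hydraulic model (flow = conductance × pressure drop, the fluid analog of Ohm's law) and an entirely made-up topology: two inlets feeding a single junction that drains through three outlets. The names, values, and network are all illustrative, not taken from the figure.

```python
import numpy as np

# Hypothetical linearized pipe network: flow = conductance * pressure drop,
# the hydraulic analog of Ohm's law (I = G * V). Topology and numbers are
# invented for illustration.
g_in = np.array([2.0, 1.0])        # inlet pipe conductances
g_out = np.array([0.5, 1.5, 1.0])  # outlet pipe conductances
p_in = np.array([10.0, 8.0])       # inlet pressures (outlets drain to 0)

def outlet_flows(p_in):
    # Conservation at the junction: inflow = outflow, solved for the
    # junction pressure p_j, then each outlet flow is g_out * p_j.
    p_j = g_in @ p_in / (g_in.sum() + g_out.sum())
    return g_out * p_j

flows = outlet_flows(p_in)

# Sensitivity of outlet index 2 (the "third pipe") to each inlet pressure,
# found by perturbing each inlet -- one row of the "train schedule" table.
eps = 1e-6
sens = [(outlet_flows(p_in + eps * np.eye(2)[k])[2] - flows[2]) / eps
        for k in range(2)]
print(flows, sens)
```

The physical plumbing rig gives the same answer the moment you open the valves; the point of the sketch is only to show the equations the digital route must set up and solve explicitly.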

Re: IS QUANTUM COMPUTING EQUATABLE TO ANALOG COMPUTING?

Thanks for the interesting thoughts on noise and accuracy. As I understand it, there are techniques to eliminate errors in digital processing. That works well for distinguishing 1s from 0s. But as you point out, it may not work, or even make sense, in quantum or analog computing. Both Google and IBM say they have figured out quantum error correction. I must admit I don't understand how that works.

IS QUANTUM COMPUTING EQUATABLE TO ANALOG COMPUTING?

This op-ed is commendable for providing many very interesting references to recent work in the broad – and exponentially widening – field of quantum computing. I confess to being quite unable to keep abreast of these developments. However, as for the suggested quasi-equivalence of quantum and analog computing, I feel this might be more than a bit misleading as to the important ways in which the two differ.

Nevertheless, there are some intriguing similarities. Consider, for example, a fully-analog circuit of almost any complexity, existing as, let's say, a breadboard. [Young folks: this is a circuit built up from a selection of primitive elements placed on the "board" – sans bread – and wired by hand. Primitives range from such basic elements as R, L, and C, through discrete transistors, up to analog ICs.] Let's omit for now any energy-storage elements – principally L's and C's, though there are of course others – having a significant effect on the circuit's time-domain behaviour. One could refine all these definitions.

But here's the similarity: when the primary source of [DC] power is applied, the circuit instantly "solves" the equations for all the elements, both individually and as an interactive ensemble. It does not need to be "told" about the physics of these elements via modeling equations, because the circuit is itself a piece of the real, physical world. In this respect, it is an "analog" of nothing. The circuit's response to any other stimuli applied to it will likewise occur without any recourse to computation. In this respect it IS its "own analog computer".
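For contrast, here is what a digital computer has to do to "solve" even a trivial breadboard: write down Kirchhoff's current law at each node and solve the resulting linear system. The circuit below – a 5 V source through R1 into node 1, R2 from node 1 to ground, R3 bridging to node 2, and a load resistor from node 2 to ground – is a made-up example; the values are arbitrary.

```python
import numpy as np

# Nodal analysis of a toy two-node resistor network. The physical circuit
# settles to these voltages the instant power is applied; digitally we must
# build and solve the conductance-matrix equation A v = b ourselves.
R1, R2, R3, Rload, Vs = 1e3, 2e3, 1e3, 4e3, 5.0
G = lambda R: 1.0 / R

# Kirchhoff's current law at nodes v1 and v2.
A = np.array([[G(R1) + G(R2) + G(R3), -G(R3)],
              [-G(R3),                G(R3) + G(Rload)]])
b = np.array([Vs * G(R1), 0.0])
v1, v2 = np.linalg.solve(A, b)
print(v1, v2)
```

Two equations solve in microseconds, of course; the contrast only becomes interesting when the "tangle" has thousands of nonlinear elements and the matrix solve must be iterated at every timestep, which is exactly what SPICE-class simulators spend their time doing.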

Now, we may question the absolute accuracy of any "result" – say, the output voltage of a simple amplifier – of this tangle. Whatever solution it happens to settle on – by the rapid conversation that occurs when any stimulus is altered – is its own "private" solution. The matter of providing accuracy never occurs to an analog circuit. It might be quite accurate – enough for the ensemble to be used as a component in a useful [that is, high-precision] analog computer; but, as far as the circuit is concerned, it is simply solving its internal equations, whose accuracy in some absolute sense can range from poor to excellent.

We may yet find another similarity here to quantum computing. While a small ensemble of quantum elements – a few qubits' worth – may fortuitously appear to behave quite well, and very, very fast in operation, I have a hunch that as they become larger in extent – employing hundreds of elements – they will be prone to [at least one of] the same enemies as analog computing, namely universal noise. Operation at low temperatures may squelch basic "qunoise"; but some types of circuit error – those attributable to shot noise, for example, which is not temperature-dependent [at least as generally explained and modelled, although, as an aside, there may be some errors of rigour here as far as real devices are concerned], and even some fancy variety of 1/f noise – may eventually show up in the large quantum computers yet to be designed and built.
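The statistical core of that hunch can be sketched in a few lines. This is purely a toy model – no quantum physics, just the textbook result that N independent, uncorrelated error sources of standard deviation sigma combine to an RMS error of roughly sigma * sqrt(N) – so the parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.01  # per-element error contribution (arbitrary units)

def rms_error(n_elements, trials=5000):
    # Each trial sums n_elements independent Gaussian errors; the spread
    # of the sums is the ensemble's combined RMS error.
    errs = rng.normal(0.0, sigma, size=(trials, n_elements)).sum(axis=1)
    return errs.std()

for n in (1, 100, 400):
    print(n, rms_error(n))  # grows roughly as sigma * sqrt(n)
```

Whether real large-scale quantum hardware behaves like uncorrelated classical noise sources is exactly the open question; the sketch only shows why sheer element count, by itself, works against accuracy.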

