In my last blog, Operations per Joule, I talked about digital, analog, and the brain — comparing the energy efficiency of each and questioning the approach we take to solve problems.
The brain gets all of its advantage by working with physics rather than using some foreign mechanism to capture and process information. We don't yet know how it does all of it, but one aspect of the brain is very clear: it relies on large numbers of slow communication channels rather than concentrating communications into a few very fast ones. The systems we design are based on how fast we can make communications, attempting to optimize either throughput or latency.
We talk about signal propagation times in a computer in the nanosecond range, while for the brain it is in the millisecond range. It would appear that computers have a 10^6 speed advantage, and yet we cannot even come close to matching the brain's total processing power.
Now, I want to tread on shaky ground for a moment, and I hope nobody takes offense. Haier and Jung are neuroscientists who have been studying the brain for a long time. In a 2005 report they determined that men tend to have more gray matter while women have more white matter. Gray matter loosely equates to processing, while white matter equates to communications. The fact that men and women seem to have similar mental abilities would imply that the two are partially interchangeable, in that both provide a certain level of capability.
In the digital world, there is a very clear distinction between what is processing and what is communications. We have, in fact, built all of our programming paradigms around the fact that computation is faster than communications, so we attempt to minimize the latter as much as we can. We also separate memory from computation and cluster each of them, primarily for practical fabrication reasons, but this inadvertently puts communications in the way of computation.
So while we have managed to create communications that are much faster than what happens in the brain, we have not managed to use that speed to our advantage. In the analog world, we are perhaps closer to the brain model, in that computation and communications are not really that separate. Wires connect components in a network and can even perform some aspects of the computation itself. In the previous blog I used Kirchhoff's current law to demonstrate how a wire can perform an addition.
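To make that concrete, here is a minimal sketch of the idea with made-up component values: two currents meeting at a node sum on the wire itself, and a sense resistor turns that sum into a voltage we can read.

```python
# Minimal sketch of Kirchhoff's current law performing addition (illustrative only).
# Two hypothetical current sources drive the same node; the node current is their sum,
# and a sense resistor converts that summed current back into a readable voltage.

i1 = 2.0e-3      # current from source 1, in amperes (assumed value)
i2 = 3.5e-3      # current from source 2, in amperes (assumed value)
r_sense = 1.0e3  # sense resistor, in ohms (assumed value)

i_node = i1 + i2          # KCL: the currents into the node add on the wire itself
v_out = i_node * r_sense  # Ohm's law turns the summed current into a voltage

print(f"Summed current: {i_node*1e3:.2f} mA, output voltage: {v_out:.2f} V")
```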
What we don't seem to have worked out is how to deal with noise in the analog world. It is generally accepted that the brain is very noisy and uses redundancy, either internally or externally (multiple people thinking about a problem). It's as if one mind can never really be trusted, so multiple minds compensate for it. It is not clear whether the brain uses averaging or excludes outliers through some form of threshold, but one thing is clear: it does not define noise as the number of bits of resolution that can be placed into a digital flow; it is more about how useful the result is for an intended function.
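As a toy illustration of those two strategies (the numbers are made up, and this is not a model of real neurons), compare simple averaging with rank-based outlier exclusion across a handful of noisy, redundant "units":

```python
# Toy comparison of two redundancy strategies over noisy "units" (illustrative only).
import random
import statistics

random.seed(1)
true_value = 1.0
readings = [true_value + random.gauss(0, 0.1) for _ in range(9)]
readings[0] = 5.0  # one badly misbehaving unit (an outlier)

mean_estimate = statistics.mean(readings)      # simple averaging: pulled by the outlier
median_estimate = statistics.median(readings)  # rank-based exclusion: barely affected

print(f"mean: {mean_estimate:.3f}, median: {median_estimate:.3f}, true: {true_value}")
```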
It is possible that the brain has a better manufacturing system than our chips, one that yields more consistent performance in each of the cells, but there are techniques, such as digitally assisted analog, that can compensate for some of this; they will be the subject of some future blogs. For now I will leave you with a question: Did the introduction of digital into the processing chain make life more difficult for analog design? Would certain things be easier if they went back to being all analog?
Interesting points on men's versus women's brains. I remember when digital started in the power supply world. An issue came up regarding the response time of the feedback loop. There are still questions about the benefits of digital power supplies when it comes to step response. However, a good mix of the two can provide good overall performance by adjusting the parameters to even out the efficiency curve over the DC load range. With a good control loop, the efficiency at extremely low and high loads can approach the efficiency at the optimum load simply by adjusting the frequency of operation. A regular power supply does not have this luxury, since the circuitry within operates in either a burst mode or a pulse-skipping mode.
Derek – so is there an implied correlation between power supplies and gender? I.e., are digital control-loop-based supplies male and analog-loop versions female? Just wondering. It might explain a lot.
Not touching that comment Brad. The wife would cut me off for a few weeks. 🙂
Even before deciding which one is better?
What I want to know from the knowledgeable people here is: how does 'Chaos', AKA non-linear dynamics, fit into the functioning of the brain? I've seen little mention of this over the years.
For example see http://www.rochester.edu/news/show.php?id=2683 “Mysterious 'Neural Noise' Primes Brain for Peak Performance”
Are we too quick to try to get rid of noise, rather than use it?
Could the brain be using Strange Attractors?
In the analog world, out-of-band 'noise' in the form of dither might be deliberately added to an A/D converter to improve the spurious-free dynamic range (SFDR), or, in the case of simulated annealing, random energy is added to force a decision over a threshold (a small sketch of the dither idea follows below).
http://www.analog.com/library/analogDialogue/archives/40-02/adc_noise.html
Maybe we should embrace the world of noise, and use it, rather than always fighting it…
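Here is a rough sketch of the dither point made above, with arbitrary values and no particular ADC in mind: a DC level sitting below one LSB is invisible to a coarse quantizer on its own, but with random dither added and some averaging, the level can be recovered.

```python
# Illustrative sketch: adding dither before a coarse quantizer lets averaging recover
# detail below one LSB. All values are arbitrary assumptions.
import random

random.seed(0)
lsb = 1.0
true_input = 0.3 * lsb  # a DC level well below one LSB

def quantize(x):
    return round(x / lsb) * lsb  # ideal mid-tread quantizer

no_dither = [quantize(true_input) for _ in range(10000)]
with_dither = [quantize(true_input + random.uniform(-0.5 * lsb, 0.5 * lsb))
               for _ in range(10000)]

print("average without dither:", sum(no_dither) / len(no_dither))      # stuck at 0.0
print("average with dither:   ", sum(with_dither) / len(with_dither))  # approaches 0.3
```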
An obscure area of brain study is the use of extrema theory. That is, it concerns the points in a wave where the direction changes; in a sine wave these would be the peak and valley points. The time between the points is what is important. For example, in speech the intelligence is in this time domain, while the identity of the speaker is in the frequency domain (a toy sketch of extrema timing follows after the links below).
http://www.edn.com/design/analog/4352503/EDN–03-03-94-Ratio-detection-precisely-characterizes-signals-amplitude-and-frequenc
“Straightforward models of how humans perceive sound and color lead to amazingly simple techniques that use Gaussian filters and ratio detectors to extract signals' information content.”
See this book for more: http://www.amazon.com/Human-Machine-Intelligence-Evolutionary-View/dp/0882479563
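And a toy sketch of the extrema-timing idea, using a made-up 5 Hz test tone and sample rate: find the points where the direction changes and report the time between them.

```python
# Toy sketch: find the direction-change (extrema) points of a sampled waveform and
# report the time between them. The waveform and sample rate are made up.
import math

fs = 1000.0  # sample rate in Hz (assumed)
samples = [math.sin(2 * math.pi * 5 * n / fs) for n in range(400)]  # 5 Hz test tone

extrema_times = []
for n in range(1, len(samples) - 1):
    rising_to_falling = samples[n - 1] < samples[n] >= samples[n + 1]
    falling_to_rising = samples[n - 1] > samples[n] <= samples[n + 1]
    if rising_to_falling or falling_to_rising:
        extrema_times.append(n / fs)

intervals = [t2 - t1 for t1, t2 in zip(extrema_times, extrema_times[1:])]
print("intervals between extrema (s):", [round(t, 4) for t in intervals])
```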
Wow – you have certainly given me a lot of reading to do, but I appreciate it. Fascinating subject and I agree with you that sometimes we are way too eager to try and eliminate something when it may actually be helpful to us in the long run.
I like the term "digitally assisted analog". It conveys the fact that both have their strengths and that, when combined, they make a far superior process or function than they would individually.
@Steve, I still don't get why on Earth they named it "digitally assisted analog"; it sounds quite confusing. Yes, it's a far superior process, but the term is quite confusing.
It is quite elitist – should it be "analog enhanced with digital" instead?
@Brian, thanks for the post. I am curious to know if the concept behind this digitally assisted analog design is something similar to mixed-signal design, and do we encounter the same challenges as in mixed-signal design?
I will most certainly talk about the challenges, but this is not mixed-signal design as you know it. The two parts are intimately tied together. The digital part is used to tune, calibrate, or somehow bring the analog part into a near-perfect operating point.
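As a minimal sketch of that tuning idea (not any specific product's scheme; the trim DAC and all numbers are assumptions), a digital loop can step a correction code until an analog offset is trimmed to within half a step:

```python
# Minimal sketch: a digital loop trims an analog amplifier's unknown offset by
# stepping a hypothetical correction DAC. All values are made up.

true_offset_mV = 7.3  # unknown analog offset we want to cancel (assumed)
dac_step_mV = 0.5     # correction resolution of the assumed trim DAC
dac_code = 0

def measure_output_mV(code):
    """Model of the analog path with the input grounded: offset minus correction."""
    return true_offset_mV - code * dac_step_mV

# Step the code until the residual drops to within half a step of zero.
while measure_output_mV(dac_code) > dac_step_mV / 2:
    dac_code += 1

print(f"final code: {dac_code}, residual offset: {measure_output_mV(dac_code):.2f} mV")
```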
And just slightly related – I wonder if for women's brains, the poles are located mostly in the right-hand plane. Seems likely.
I'm really not sure who coined it, but it is quite widely used in the industry, in IEEE papers, etc. Stanford University has published much using this term, and there is even a book published in 2011 by Springer entitled "Digitally-Assisted Analog and RF CMOS Circuit Design for Software-Defined Radio".
I like it because to me it says that the circuit is essentially analog, but enhanced by some digital technique. I guess it started with ADC calibration, PA and DAC pre-distortion, mostly digital PLLs and such.
I cannot believe that an analog world would get better performance, considering the new technological possibilities that exist today.
In fact, we have a process where one depends on the other; like "men and women," digital and analog form a cycle.
We also gain much greater development power from, let's say, that cooperation.
I'm not a brain scientist (or a rocket scientist for that matter), but I thought that a lot of the work on neural networks was based on information being stored in patterns of signals in a highly meshed network configuration. I never had the impression the waveform mattered much; it was which nodes (neurons) fired at which times. My sense was that amplitude might matter: if a larger pulse arrives at a node, it increases the chances it sends something on. But I also have always tacitly assumed there is some kind of accumulator so a neuron can see if it gets multiple hits before deciding to fire. In addition, I thought that thresholds were adjustable, which constitutes part of learning.
My simplistic view always seemed more akin to an analog method, and the digital simulations were at a huge disadvantage by having not only to represent every node but also to run heuristics (code) for each node that is unique and has some learning functionality. That is a lot of overhead vs. the actual neural network in a brain.
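A toy version of that accumulator-and-threshold picture might look like the sketch below; all numbers are made up, and this is an illustration rather than a neuroscience model.

```python
# Toy accumulator-and-threshold neuron along the lines described above (illustrative only).

def run_neuron(input_pulses, threshold=1.0, leak=0.9):
    """Accumulate weighted pulses with a leak; fire and reset when over threshold."""
    level = 0.0
    fired_at = []
    for t, pulse in enumerate(input_pulses):
        level = level * leak + pulse  # leaky accumulation of incoming hits
        if level >= threshold:        # adjustable threshold: the "learning" knob
            fired_at.append(t)
            level = 0.0               # reset after firing
    return fired_at

pulses = [0.3, 0.0, 0.4, 0.5, 0.0, 0.0, 0.2, 0.9, 0.1, 0.0]
print("fires at steps:", run_neuron(pulses))                 # default threshold
print("fires at steps:", run_neuron(pulses, threshold=0.6))  # lower threshold fires sooner
```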
So, I guess my comment about RHPs and stability will be completely off topic (in spite of being quite clever).
>>I also have always tacitly assumed there is some kind of accumulator so a neuron can see if it gets multiple hits before deciding to fire. – Probably so, but don't think in terms of a digital accumulator; think chemically instead. So it can accumulate info quickly, or massively, or all at once – depending on how you want to think of it.
I am no brain expert either, but I know it is not digital. My point was that digital does not use the science it has to play with, whereas the brain clearly does – in its chemical/electrical processes. In the electronics world, only analog has the opportunity to use physics.
@Brad – I agree, it has to be more of an analog accumulator, and the response might depend on both the accumulated level and the rate.
Which reminds me – didn't somebody work on some sort of an analog computer along these lines? Not a digital one that mimics this sort of analog functionality – maybe “fuzzy logic” if it was done in a non-digital manner. This analog accumulator seems familiar.
Hi Brad–here are a couple of mind-bending pages more or less on that topic:
BINDS Lab Analog Neural Networks
BINDS Lab Super Turing Computation
Thanks – I'll study these and report back – or blog about neuro-chemical-based analog computers.
So here I am with two computers. The computer on the left is where I use a trackball left-handed. The computer on the right is where I use a mouse right-handed. Now, if only I could devote one eye to each screen.
@Brad: “… I wonder if for women's brains, the poles are located mostly in the right-hand plane. Seems likely.”
Just to complement that: in the digital version, the men have their poles inside the unit circle. Funny.
Edward Lorenz summarized chaos as follows:
“Chaos: When the present determines the future, but the approximate present does not approximately determine the future.” –Edward Lorenz
I love that description for its clarity, but it belies the underlying reality: 'chaos' is just a label we apply to problems that we can't adequately model at the moment… usually either because our model is too simple or not suitable, or because our data is inadequate. Once we understand how something works, we tend to stop calling it 'chaotic'.
The Artificial Intelligence community suffers from the same problem. Something is called intelligent until we understand how it works. Then it becomes 'deterministic' — 'clever', perhaps, but not 'intelligent'. Interestingly, many people intuitively say that a machine can never 'feel' because it lacks that certain 'je ne sais quoi ' (French for 'I don't know what'). But if we someday discover how 'feeling' works, will we then be just as unfeeling as the machines, or will the machines gain a newfound respect?
So, to answer the question, “How does chaos fit into the functioning of the brain?”: For now, very well. But in the future, not at all 😉
I'm sure that is wholly unsatisfying, so allow me to offer this:
I wholly agree that the 'noise' is not noise at all. It is a deterministic (with enough state data) competition of low-strength signals in a feed-forward and feedback network of Schmitt-triggered monostable multivibrators. When someone wins the competition, you see low-frequency oscillations that help suppress the whining losers, while synchronizing a group of related winners into an ad-hoc special-interest group that exchanges contact information for future use.
How's that for a mixed metaphor?
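Stripping away the metaphor, the competition part might be sketched like this (the signal strengths and the inhibition factor are invented): the strongest signal wins and suppresses the rest.

```python
# Tiny winner-take-all sketch in the spirit of the "competition" described above.
# Signal strengths and the inhibition factor are made-up values.

signals = {"idea_a": 0.42, "idea_b": 0.55, "idea_c": 0.40}
inhibition = 0.5  # how strongly the winner suppresses the losers (assumed)

winner = max(signals, key=signals.get)
after_competition = {
    name: (strength if name == winner else strength * (1 - inhibition))
    for name, strength in signals.items()
}
print("winner:", winner)
print("post-competition strengths:", after_competition)
```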
I see. So, you're ambidextrous. I'd give my right arm to be ambidextrous.
Brad, here are all the details
My new home office
My new home office: Update 1
My new home office: Update 2
My new home office: Update 3
Have fun.
>I'd give my right arm to be ambidextrous.
That would make you monodextrous. No wait, you already are, even with two arms.
You could then become the one-armed bandit, or one arm better than Nessarose (From Wicked, the book).