Neuronics: Long-Term Memory, Part 2

In the previous part of this blog (Neuronics: Long-Term Memory, Part 1), we explored the basis for the development of a technology that could store an analog quantity indefinitely. We looked at squaring and multitanh circuits.

A somewhat different inverse-function technique is applied to the multitanh circuit by driving the input of the multitanh function directly with the circuit input. Then as vi increases, vo will reach the first maximum, the peak of the first sine cycle. As vi increases further, vo decreases, reaches a minimum, and increases again until it reaches the second peak and descends again. A reference voltage source at the sine center voltage can activate feedback circuits at each cycle and send back positive feedback to hold the multilevel hysteresis circuit at a given cycle.

The result is not a function but something more like a hysteresis loop, except that there are more than two stable points for a given input, as shown below. Consequently, more than one bit of information can be stored by such a circuit indefinitely.
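To make the idea concrete, here is a minimal numerical sketch. It is a hypothetical first-order model, not the actual multitanh circuit: the stored voltage relaxes under a sine-shaped feedback error, so every integer level is a stable hold point. Different starting values latch onto different levels, and a single storage node can hold more than one bit.

```python
import numpy as np

def settle(v0, gain=1.0, dt=1e-3, steps=20000):
    """Relax a stored value under a sine-shaped feedback error.

    Hypothetical model: dv/dt = -gain*sin(2*pi*v), so every integer level
    is a stable hold point and every half-integer level is unstable.
    """
    v = v0
    for _ in range(steps):
        v -= gain * np.sin(2 * np.pi * v) * dt
    return v

# Different input histories (starting values) latch onto different levels.
for v0 in (0.1, 0.9, 1.3, 2.4, 3.3):
    print(f"start at {v0:4.2f} -> holds near {settle(v0):5.2f}")
```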

Recall from the previous part that we were looking at the generation of a repeating sine function that could be extended over multiple cycles.

We also considered a square-root circuit that could rotate the graphical representation of the function by 90°.

Continuing, the inverse function has five distinct values for a given input voltage. Which of these values the output takes depends on the history of the input. The even-numbered nodes are stable because the inverse function has a negative slope at their zero-crossings, so the feedback corrects any deviation from the stable value at the zero-crossing. (This is analogous to synchronous-motor stability.) Because these “nodes” are stable, output values can be retained indefinitely.
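The slope argument can be checked numerically. The sketch below assumes a sine-shaped feedback curve f(v) = −sin(2πv) (an illustrative stand-in for the multitanh curve), finds its zero-crossings over a couple of cycles, and classifies each one by the sign of the slope there; the even-numbered nodes come out stable and the odd ones unstable.

```python
import numpy as np

# Assumed sine-like feedback curve; its zero-crossings are the candidate nodes.
v = np.linspace(-0.3, 2.3, 10007)
f = -np.sin(2 * np.pi * v)

# A sign change between adjacent samples marks a zero-crossing (a node).
idx = np.where(f[:-1] * f[1:] < 0)[0]
for n, i in enumerate(idx):
    slope = -2 * np.pi * np.cos(2 * np.pi * v[i])   # analytic df/dv at the node
    kind = "stable" if slope < 0 else "unstable"
    print(f"node {n}: v ≈ {v[i]:4.2f}  slope {slope:+.1f}  -> {kind}")
# Nodes 0, 2, and 4 come out stable; nodes 1 and 3 are unstable.
```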

From Barrie’s past research on multitanh circuits, it appears that the number of cycles of such a function can be extended indefinitely, so that the information content of a multitanh hysteretic switch depends only on how many BJTs, I’s, and R’s are added to the feedback circuit. Such a scheme could employ a million BJTs operating at low currents. The lower current limit is determined mainly by the low-level injection characteristics of the BJTs, and can be as low as a fraction of a nanoampere. MOSFETs have different characteristics, though the same kind of multivalued hysteretic circuits is also a possibility for them. Thus, ultra-large-scale integration of neuronic BJT circuits might be viable.

For 262,144 = 2^18 BJTs, and using 6-bit switches, the number of synaptic weights that can be stored on one chip is about 2^(18−6) = 2^12 = 4096. With 8 weights per neuron, this results in 512 neurons, enough for some significant neuronics applications. This memory scheme ostensibly achieves the indefinite retention of a state voltage value for a weighting function. It is not the only possibility for analog storage. The memory is volatile (as is organic memory to a significant extent).
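Restating that capacity estimate as arithmetic (the assumption implied by dividing 2^18 by 2^6 is roughly one BJT per distinguishable level, i.e., 64 BJTs per 6-bit switch):

```python
# Capacity estimate from above, assuming one BJT per stored level.
total_bjts = 2 ** 18          # 262,144 BJTs on the chip
levels_per_weight = 2 ** 6    # a 6-bit switch needs 64 distinguishable levels
weights = total_bjts // levels_per_weight   # 4096 synaptic weights
neurons = weights // 8                      # 8 weights per neuron -> 512 neurons
print(total_bjts, weights, neurons)
```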

Another scheme that avoids the dissipative discharge of hold capacitors stores information in nonlinear oscillators that are described in the phase plane by (closed and stable) limit cycles. Because they are nonlinear, if a particular parameter is set right, an initial voltage value can cause frequency bifurcation of the limit cycle, so that it is possible to select one of multiple oscillation frequencies, with one bit per stage of frequency bifurcation. The parameter variation required to cause successive stages of frequency splitting decreases with each splitting stage, where each new stage requires a parametric ratio change of 1/δ, where δ is the Feigenbaum number, δ ≈ 4.66920.

Consequently, every additional bit of storage requires a range extension of log2 δ ≈ 2.22 bits. Thus 6 bits requires that the oscillator parameter — some analog quantity — be selectable over a range of 13.34 bits or about 4 decades of range. The desirable feature is that information is encoded as frequency, and this can be measured to great accuracy, though its conversion for application to other neural inputs of successive stages is problematic. Or is it? It seems that organic neurons operate in part on information encoded by the frequency of arriving pulses.
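The range arithmetic, restated numerically:

```python
import math

# Range extension per stored bit, and the total range for a 6-bit value.
delta = 4.66920                              # Feigenbaum number
bits_per_stage = math.log2(delta)            # ≈ 2.22 bits of parameter range per stored bit
range_bits = 6 * bits_per_stage              # ≈ 13.34 bits for 6 stored bits
range_decades = range_bits * math.log10(2)   # ≈ 4.0 decades of parameter range
print(f"{bits_per_stage:.2f} bits/stage, {range_bits:.2f} bits total, "
      f"{range_decades:.2f} decades")
```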

Two ideas have been presented for the indefinite, though volatile, storage of information in analog form. Volatility is a secondary concern because circuits can conceivably (like organic brains) be powered for the lifetime of circuit use. We have yet to make an attempt at an analog CMAC, and some ideas for it are coming in the next article of this series.
