I was doing some reminiscing about my college days and comparing them to what is being taught today. One thing I clearly see is that schools focus more on the digital side of electronic design than on the analog side, and there is more simulation work than hands-on circuit building. One of the interesting things I remember using was the analog computer.
This was a neat item to work with during lab sessions. For those who may be too young to know what it is, the analog computer was a nice tool for fully understanding the basic circuits that add, subtract, multiply, divide, differentiate, and integrate. The lab work basically took an equation, and the student had to patch the appropriate modules together to solve it. Though it sounds simple and mundane by today's standards of using a simulator or a computer, the challenge was figuring out what voltage levels one could use without saturating the outputs of the various stages.
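To give a feel for the patching exercise, here is a minimal software sketch of the classic analog-computer demo: solving x'' = -x with two integrator modules and an inverter in a feedback loop. The integrators are modeled as ideal; on the real machine each one would clip at its supply rails, which is exactly the scaling problem mentioned above. All values here are illustrative assumptions, not from any particular lab.

```python
# Sketch: two cascaded integrator "modules" plus an inverter patched
# in a loop to solve x'' = -x, stepped numerically.
def solve_harmonic(x0=1.0, v0=0.0, dt=1e-3, t_end=6.2832):
    x, v = x0, v0
    trace = []
    t = 0.0
    while t < t_end:
        a = -x          # inverter: feeds -x back as the acceleration
        v += a * dt     # first integrator: acceleration -> velocity
        x += v * dt     # second integrator: velocity -> displacement
        trace.append(x)
        t += dt
    return trace

trace = solve_harmonic()  # one full period of a cosine, amplitude 1
```

After one period (t = 2π), the output returns close to its starting value of 1, just as the voltage on the real machine would trace out a sine wave.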
The adding and subtracting modules are very easy to use, since they are the basic op-amp circuits found all around us. The challenge came with the other devices. A differentiator, for example, could easily saturate if the input step's rise time was too small.
The nice thing about today's circuitry is that it is faster and uses less power than before. The best part about analog computers is that the signal does not need to be digitized before processing; only propagation delays kept the answers from appearing instantly.
Thinking back through my design years, I can recall two instances where the analog-computer thought process was used extensively. The first was an image processing system, and the other mimicked a Wheatstone bridge.
The image processing project looked for a rate of change outside a vehicle, as the system was meant to detect an incoming projectile. The concept was to project a fan of light and then look for reflections off the incoming projectile; based on the projectile's geometry, the system was expected to see a set number of reflections.
My rough sketch of the scenario is below:
The trick was that any processing had to be quick, since time to react was crucial. The program took two paths to solve the problem, one analog and one digital. I got to play with the analog design.
The thought process was to take a linear CCD and use it as an analog shift register. There would be successive scenes to compare, and the circuit would generate a signal if there was a consistent detection. The circuitry would compare the signals from the current imaging time against successive image times to see if there was a strong reflection.
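In software terms, the comparison amounts to differencing successive CCD scans and flagging only the pixels that change strongly frame after frame. This is a hypothetical digital analog of the scheme; the threshold and frame data are made up for illustration:

```python
# Sketch of the successive-scene comparison: flag pixel positions whose
# change exceeds a threshold in EVERY consecutive pair of scans,
# i.e. a "consistent detection". Values are illustrative assumptions.
def detect(frames, threshold=0.5):
    """Return pixel indices that change strongly between every scan pair."""
    n = len(frames[0])
    hits = set(range(n))
    for prev, cur in zip(frames, frames[1:]):
        hits &= {i for i in range(n) if abs(cur[i] - prev[i]) > threshold}
    return sorted(hits)

frames = [
    [0.1, 0.1, 0.1, 0.1],
    [0.1, 0.9, 0.1, 0.1],   # reflection appears at pixel 1
    [0.1, 0.1, 0.9, 0.1],   # reflection moves on to pixel 2
]
consistent = detect(frames)  # only pixel 1 changed in both intervals
```

The analog version did the same thing continuously, with no digitization step in the loop.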
The real problem was not the analog circuit so much as obtaining the analog shift registers. This project was done back in the early '90s, just as companies were phasing out the analog shift register, so I ended up using a linear CCD and feeding a signal into its white-balance input. The design did not go very far, because that type of CCD configuration was also being phased out and acquiring components for future builds was in doubt.
The second analog circuit development was essentially making a DC strain gage act like an AC-excited strain gage. The following circuit block is what was used:
Essentially, A1 acts as the subtracting amplifier, since it is a differential amplifier with divider resistors to drop down the high AC voltage. A2 is used as a phase shifter so that the output is always in phase with the AC reference input. The DC input goes into buffer A3, and A4 is used as a reference to handle the non-zero-volt zero reference from the DC input. The multiplier circuit then takes the AC reference as the X1 signal, with the X2 signal grounded. The buffered DC input is fed into the Y1 signal, and the A4 reference is fed into the Y2 signal. The multiplier then performs the simple task of computing (X1-X2)*(Y1-Y2). There is some additional signal processing, but this circuit worked out very well.
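The multiplier stage can be summarized numerically: with X2 grounded and Y2 holding the zero reference, the output is an AC waveform whose amplitude tracks the net DC strain signal, which is what makes the DC gage look AC-excited. A minimal sketch, with amplitudes, frequency, and offsets chosen as illustrative assumptions:

```python
import math

# Sketch of the (X1-X2)*(Y1-Y2) multiplier stage described above.
# X1 = phase-aligned AC reference, X2 = ground (0 V),
# Y1 = buffered DC strain signal (A3), Y2 = A4's zero-offset reference.
# All numeric values are illustrative assumptions.
def multiplier_out(t, ac_amp=1.0, freq=1000.0, v_dc=2.5, v_zero=2.0):
    x1 = ac_amp * math.sin(2 * math.pi * freq * t)  # AC reference path (A1/A2)
    x2 = 0.0                                        # grounded
    y1 = v_dc                                       # buffered DC input
    y2 = v_zero                                     # zero reference
    return (x1 - x2) * (y1 - y2)

# At the positive peak of a 1 kHz reference (t = 0.25 ms), the output
# equals the net DC signal: 2.5 V - 2.0 V = 0.5 V.
peak = multiplier_out(0.25e-3)
zero_crossing = multiplier_out(0.0)
```

Subtracting Y2 before the multiply is what removes the gage's standing offset, so only the strain-proportional part rides on the AC carrier.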
So, the big question for you readers: what types of analog circuits have you worked on where digital processing just would not be the best solution?