In college, we were taught analog design, RF design, high-power design, and digital design. To me, digital was so much simpler. You could forget about things such as voltage and current, sensitivities, and so many other factors. With analog, the idea of putting signal processing, power, and RF(!) all on the same piece of silicon seemed impossible.
Abstractions appeared to work, in that you could abstract transistors in specific configurations into primitive functions, such as AND gates. Those gates were packaged into commonly used functions such as those provided by Texas Instruments in its 7400 family. So long as you conformed to loading limits, you only had to consider “1” and “0,” and if you really needed to mix logic types like TTL and ECL, you used converters between the blocks.
By the time I graduated college, I almost knew the 7400 series by heart and only had to consult the manual every now and then to check on pin assignments, etc.
Those abstractions enabled me to design bigger things. I could design a whole computer in a day and didn't have to fiddle around checking corner cases, deal with component selection, or make sure the tolerance of the resistor I specified would work. A single day! I could load it up in a simulator, using readily available models, or could construct the models fairly easily if they hadn't already been created, and quickly iron out the most fundamental problems. Running software on it after it had been built would find the rest of the problems. I can't imagine being able to do an analog design in such a short time – if at all.
My first computer was constructed using printed circuit boards I made myself with a solid copper sheet and acid etch. I guess it was at about 1/8″ to 1/4″ geometry. For the ones I couldn't route, I used wires, and, yes, it looked a mess by the time I was done, but it worked.
It ran my own operating system, Brian's Operating System (The BOS). I had problems with only one part of the whole design — designing and building the interface to the cassette tape recorder that served as my bulk storage, where my software lived. I basically just used a dual tone to record my ones and zeros and added in a few parity bits. That was a clear indication to me that I did not have what it took to become an analog designer.
For the next 30 years I was happy that analog was off in the distance. Data was converted into digital, and then I could work with it and hand it back to DACs at the end. But as designs moved to smaller nodes, and extreme cost pressures forced analog and digital onto the same die, it became impossible to ignore anymore. Luckily, by that time I was no longer designing chips. With little at stake, I quickly came to realize how “greedy” digital circuitry is. It really is not a very good citizen. Also, many of the choices made in process technologies are for the benefit of digital and not analog.
In my blogs, I will be looking at the world of analog and digital and, most importantly, the impact they have on each other. How can they be made to play nicely with each other, what are the problems, what fixes exist, and what fixes should exist? How are processes evolving to enable faster integration of more analog functionality onto a piece of silicon?
This is my goal here on Integration Nation, so while scanning the horizon for new developments, I am all ears too. What are you hearing? What do you want to hear about? Just leave a comment here or start your own message thread. Either way, my electronic door is always open, and I invite all of you in.
Let the fun begin!