
Designing With Op-Amps Is Ending

It struck me the other day why the era of designing analog circuits on a PC board using things like op-amps, data converters, and other analog pieces-parts is coming to an end. The ability to engineer a quality design at the PCB level is hampered by the lack of software tools and accurate models needed to thoroughly study a design before committing it to production. This hole in the design process makes designing with discrete parts more expensive, more error-prone, and, worst of all, substantially less reliable. These weaknesses don't exist in the integrated analog design world.

Actually, it should be easy to accurately simulate analog standard products on a computer. Certainly the IC designers of those standard products simulated their designs on a computer, so accurate simulation models and netlists already exist. Why aren't those netlists provided so that engineers designing with the products can simulate their new analog PCB designs just as accurately? Why are analog designers forced to spend money and time uncovering simple design mistakes that an accurate computer simulation would easily have caught?

I guess the answer to those questions is pretty obvious. Analog semiconductor manufacturers choose not to provide simulation models of their products with that much accuracy and detail. To do so would require them to hand over the “source code” of their hardware. Instead, they prefer to provide macro models that attempt to emulate the behaviors they believe are important. But unless one has the full netlist of an op-amp or a regulator, the simulation will not be accurate for all use cases. The analog companies know this. That's why their models come wrapped inside a legal disclaimer that is longer than the models themselves.
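To make that concrete, here is the kind of single-pole macro model that is typical of what vendors ship. This is a sketch written for illustration, not any vendor's actual model; the values are hypothetical and the syntax is generic SPICE. It captures DC gain and gain-bandwidth and essentially nothing else: no input bias current, no slew-rate limiting, no noise, no supply rejection, no output-stage behavior.

    * Hypothetical single-pole op-amp macro model (illustration only,
    * not any vendor's actual model)
    .subckt OPAMP1P inp inn out
    * lumped differential input resistance
    Rin inp inn 10meg
    * open-loop DC gain of 100,000 via a controlled source
    Eg 1 0 inp inn 100k
    * one RC pole at about 10 Hz, giving roughly 1 MHz gain-bandwidth
    Rp 1 2 1k
    Cp 2 0 15.9u
    * ideal unity-gain output buffer
    Eo out 0 2 0 1
    .ends OPAMP1P

A dozen lines like these can match the datasheet's gain and bandwidth curves perfectly while telling you nothing about the failure modes that matter in a real application.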

Contrast that situation with the SPICE models from semiconductor foundries like TSMC, GlobalFoundries, and X-FAB. They provide models that are near-perfect emulations of their products. Semiconductor foundries live and die by the accuracy of their models. Those models also include Monte Carlo features that a simulator uses to show how the devices will perform across several manufacturing lots, out to 6-sigma. And all of the effects of temperature and power supply variation are modeled. Basically, foundries don't hide how their product works or force you to buy some to find the weaknesses. It is right there on display on the computer screen.
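As a sketch of what that looks like in practice, consider the fragment below. It uses HSPICE-style syntax (statement names differ between simulators), and the library path and its statistical section name are hypothetical stand-ins for a real foundry PDK.

    * Monte Carlo sketch, HSPICE-style syntax; 'pdk_models.lib' and its
    * 'mc' statistical section are hypothetical stand-ins for a foundry PDK
    .lib 'pdk_models.lib' mc
    M1  out in 0 0 nch w=1u l=0.18u
    RD  vdd out 10k
    VDD vdd 0 1.8
    VIN in  0 0
    * 1000 samples: each run redraws the foundry's statistical parameters,
    * and the spread of v(out) across runs predicts manufacturing spread
    .dc vin 0 1.8 10m sweep monte=1000
    .end

The same mechanism extends to temperature and supply sweeps, which is how lot-to-lot behavior gets checked on a workstation instead of in a fab.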

The next generation of analog engineers will come from universities that don't teach PCB analog design. They teach the future, which is integrated. All of the tools for learning integrated analog design are available to students for free. So are the precision simulation models from the semiconductor foundries, along with accurate 3D models of packaging and 3D interconnect extraction tools. Together, these enable a student designing an analog IC to out-design anything done on a PCB using coarse or non-existent macro models.

If an IC design has a weird nuance, the transistor models provided by the foundries will most certainly show that nuanced behavior. After all, today's transistor SPICE models consist of about 6,000 equations and hundreds of parameters that SPICE uses to compute the instantaneous I-V operating point of a single device.
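For a sense of scale, a foundry model card invokes one of those equation sets and then sets its parameters. The fragment below uses HSPICE-style BSIM4 level numbering and a handful of real BSIM4 parameter names (TOXE, VTH0, K1, U0, VSAT); the values are illustrative, not from any actual process.

    * Fragment of a BSIM4 model card; values are illustrative only
    .model nch nmos level=54 version=4.8
    + toxe=4n vth0=0.45 k1=0.5 u0=0.03 vsat=8e4
    * ... a real foundry card sets hundreds of additional parameters,
    * plus binning rules and statistical sections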

Earlier today I looked at a macro model of an op-amp, released in 2013 by a major, highly respected analog semiconductor company, and noticed that its input transistor models included just four parameters. Good luck studying the real circuit behavior with that model. I guess the expectation is that designing with op-amps and other small parts requires you to cross your fingers at the system level.
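For contrast with the BSIM4 card above, a four-parameter input device amounts to something like this (reconstructed for illustration from standard Gummel-Poon parameter names; this is not the vendor's actual model card):

    * A four-parameter bipolar input device of the kind described above
    * (reconstructed for illustration, not the vendor's model)
    .model QIN NPN (IS=8e-16 BF=200 RB=100 CJE=1p)

Four numbers cannot describe noise, Early effect, high-current beta rolloff, or temperature behavior, which are exactly the regions where real designs get into trouble.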

The tools and models available today for designing at the integrated level are so precise and thorough that robustness and yield claims can be validated and proven long before the first part is produced. This level of design confidence is impossible to achieve using multiple parts from multiple vendors on multiple processes with no ability to run any statistical analysis simulations.

Arguably, quality is the most important aspect of any product design. But quality is impossible to study when the many parts you use in a PCB design aren't modeled, or when the manufacturer won't share the models with you. You have to base your production decision on building a small quantity of sample product.

In the integrated world, production release decisions are based upon precise computer-aided analysis of the entire system out to at least 4.5 sigma. This is followed by qualifying the physical IC using thousands of parts and studying those parts from every possible angle, including comparing the measured results against the exhaustive computer analysis results.
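For reference, assuming a normally distributed parameter, the odds of landing outside plus-or-minus 4.5 sigma are about 6.8 in a million, roughly 3.4 ppm per tail. That is the defect level being screened for in simulation before a single wafer is started.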

On a PCB design with stand-alone op-amps and other pieces, you are simply flying blind. That is not a sustainable design methodology.

Does this match up with your experience?
