Designing With Op-Amps Is Ending

It struck me the other day why the time for designing analog circuits on a PC board using things like op-amps, data converters, and other analog pieces-parts is ending. The ability to engineer a quality design at the PCB level is hampered by the lack of software tools and accurate models needed to thoroughly study a design before committing to production. This hole in the design process makes designing with discrete parts more expensive, more error prone, and, worst of all, substantially less reliable. These weaknesses don't exist in the integrated analog design world.

Actually, it should be easy to accurately simulate analog standard products on a computer. Certainly the IC designers of standard products simulated their designs on a computer. So the accurate simulation models and net lists exist. Why aren't those net lists provided so that those designing with the products can just as accurately simulate their new analog PCB designs? Why are analog designers forced to spend money and time to uncover simple design mistakes that would easily have been caught with an accurate computer simulation?

I guess the answer to those questions is pretty obvious. Analog semiconductor manufacturers choose not to provide a simulation model of their product with that much accuracy and detail. To do so would require them to provide the “source code” of their hardware. Rather, they prefer to provide macro models that attempt to emulate the behaviors they believe are important. But unless one has the full net list of an op-amp or a regulator, the simulation will not be accurate for all use cases. The analog companies know this. That's why their models come wrapped inside a legal disclaimer that is longer than the models themselves.
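
To make the contrast concrete, here is a minimal sketch of the kind of single-pole macro model vendors typically ship. The topology and values are illustrative assumptions, not any vendor's actual model:

    * Minimal op-amp macro model (illustrative sketch, not a real vendor model)
    .subckt OPAMP_MACRO inp inn out vcc vee
    Rin  inp inn 10meg        ; differential input resistance
    Gm   0   n1  inp inn 1m   ; whole input stage collapsed to one transconductance
    R1   n1  0   100meg       ; Gm*R1 sets the DC gain (100 dB)
    C1   n1  0   159p         ; dominant pole ~10 Hz, so GBW ~1 MHz
    Eout out 0   n1  0   1    ; ideal output buffer
    * vcc and vee connect to nothing: supply rejection, output saturation,
    * and startup behavior simply do not exist in this model
    .ends

Everything the real input pair, mirrors, and output stage do beyond this is invisible to a simulation, which is exactly the problem.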

Contrast that situation with the SPICE models from semiconductor foundries like TSMC, GlobalFoundries, X-FAB, etc. They provide models that are near-perfect emulations of their products. Semiconductor foundries live and die based upon the accuracy of their models. Those models also include Monte Carlo features that a simulator uses to show how their devices will perform over several manufacturing lots out to 6 sigma. And all of the effects of temperature and power supply variations are modeled. Basically, they don't hide how their product works or force you to buy some to find the weaknesses. It is right there on display on the computer screen.
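
To see what that looks like from the designer's chair, here is a toy Monte Carlo deck. The syntax is LTspice-flavored and the mc() tolerance is an assumption for illustration; a real foundry deck ties the distributions to measured process parameters rather than a flat tolerance band:

    * Toy Monte Carlo sketch: 200 trials of a resistor divider
    V1 in 0 5
    R1 in  mid {mc(10k,tol)}   ; mc() draws a fresh random value each run
    R2 mid 0   {mc(10k,tol)}
    .param tol=0.01            ; assumed 1% tolerance band
    .step param run 1 200 1    ; 200 trials; plot V(mid) vs. run to see the spread
    .op
    .end

A foundry deck does the same thing at the transistor level, with a measured distribution behind every threshold voltage, oxide thickness, and mobility parameter.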

The next generation of analog engineers will come from the universities that don't teach PCB analog design. They teach the future, which is integrated. All of the tools for learning integrated analog are available for free to a student. So are the precision simulation models from the semiconductor foundries, as are accurate 3D models of packaging and 3D interconnect extraction tools. All of these things together enable a student designing an analog IC to out-design anything done on a PCB using coarse or non-existent macro models.

If an IC design has a weird nuance, the transistor models provided by the foundries will most certainly show that nuanced behavior. After all, today's transistor SPICE models consist of about 6,000 equations and hundreds of parameters that SPICE uses to compute the instantaneous I-V operating point of a single device.
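
The asymmetry is visible right in the netlist. The designer writes one line per device; the model card behind it is where all of that physics lives. A minimal runnable sketch, with nearly every parameter left at its default:

    * One BSIM4 instance; the .model card carries the physics
    VD d 0 1.8
    VG g 0 0.9
    M1 d g 0 0 nch W=1u L=0.18u   ; the only line the designer writes
    .model nch nmos (level=54 toxe=4n vth0=0.4)
    * level 54 = BSIM4 in ngspice; a real foundry card lists hundreds more parameters
    .op
    .end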

Earlier today I looked at a macro model of an op-amp, released in 2013 by a major, highly respected analog semiconductor company, and noticed that its input transistor models included four parameters. Good luck studying the real circuit behavior using that model. I guess the expectation is that designing with op-amps and other small parts requires you to cross your fingers at the system level.

The tools and models available today for designing at the integrated level are so precise and thorough that robustness and yield claims can be validated and proven long before the first part is produced. This level of design confidence is impossible to achieve using multiple parts from multiple vendors on multiple processes with no ability to run any statistical analysis simulations.

Arguably, quality is the most important aspect of any product design. But this is impossible to study when the multiple parts you use in a PCB design aren't modeled or the manufacturer won't share the models with you. You have to base your production decision on building a small quantity of sample product.

In the integrated world, production release decisions are based upon the precise computer-aided analysis of the entire system out to at least 4.5 sigma. And this is followed by qualifying the physical IC using thousands of parts and studying those parts from every possible angle, including comparing the measured results against the exhaustive computer analysis results.

On a PCB design with stand-alone op-amps and other pieces, you are simply flying blind. That is not a sustainable design methodology.

Does this match up with your experience?

Comments on “Designing With Op-Amps Is Ending”

  1. kendallcp
    June 11, 2013

    Scott, thanks, a very articulate and passionate observation.  There's a part of me that agrees with the post in its entirety.  But I think one critique that may be widely leveled at it is: 'Twas ever thus.  You could easily have written this post at any time in the last, well, many years.  The only thing that makes it really contemporary is the observation that these days great tools and accurate models are free, and really powerful computers upon which to run them are highly affordable.

    But countless successful electronic systems out there have been designed over the past few decades using this 'flying blind' design approach that you (and part of me) consider unsustainable.  Some of the success of those systems relies on analysis instead of simulation.  In the 'good old days' you really had to understand how a circuit worked and how to calculate your error budgets.  This debate about the inexorable replacement of circuit understanding with simulation has been running for years at all levels, and has developed into something akin to a religious schism.

    But of course it all started a long time ago.  The original fathers of SPICE realized that as circuits and device dynamics got more complex, analytical solutions to circuit design would inevitably give way to repetitive simulation as an optimization tool.  What has happened is that the comfortable, now rather quaint world of databook-and-parts-drawer circuit design has been becalmed in a backwater in which models and support for the designer haven't kept up.  No wonder ASIC and ASSP design is seen by management as the more robust way forward – as well as demanding rather less in the way of old-fashioned skill sets.

    You mention opening the lid on a 2013 opamp model.  You lucky beggar (to employ a clean version of the English saying) to even see such a recent model.  You might have seen in my EDN-reprinted bypass capacitor articles that TI are still pointing people to IC models that date back to 1990 and before.  There is clearly not enough pressure on companies like that to get the models right.  This is what's going to kill good old-fashioned circuit design.  It won't be capable of delivering the predictability that modern designers need, if it's rooted in approximations that were made before they were born.

  2. DEREK.KOONCE
    June 11, 2013

    “Why aren't those net lists provided so that those designing with the products can just as accurately simulate their new analog PCB designs?” — Besides potentially giving away their design secrets, I see this not being handed out due to simulation time. If it takes 10 minutes to simulate the chip with all of the discrete parts, imagine the time to simulate a circuit using 10 of the ICs. Thus macro models are supplied.

  3. Scott Elder
    June 11, 2013

    There is SPICE.  And then there is fast SPICE.  IC companies use fast SPICE simulators (UltraSim is one example) when the simulation netlist gets too large.

    Simulating with 1 million transistors is not a problem today (an op amp is about 10-100 transistors).  And in mixed-mode design, the logic is not simulated in the transistor domain.  PSPICE is not a fast SPICE simulator; the two differ by a few zeroes in price.

    I always insist on a full power up transistor level simulation before going out for parts.  This is on a system with 10-bit data converters, track and hold loops, multiple switching regulators, amplifiers, references, programmable memory, etc.

    It is not practical to see the whole system fully operational (steady state), but you can see clocks/oscillators come up, regulators and references come up, the boot-load process start, etc.  Look at just enough to know that the silicon will have no problems powering up, which then ensures one can troubleshoot the more subtle performance features if necessary when the silicon comes out.

    An IC company has SPICE for each engineer, a few fast SPICE licenses that are used less often, and computer farms to spool out 1000 runs in parallel overnight.  Integrated analog design is 98% on the computer, 2% on the bench.  Analog bench work is mostly done by applications engineers, test engineers, and product engineers.  Analog IC designers that need lots of bench time don't stay employed very long.  Lots of bench time means there must have been lots of problems not foreseen in the computer analysis.  That excuse doesn't fly much anymore.  The tools and models are just too good.


  4. Scott Elder
    June 11, 2013
    @kendallcp
     
    I'll take a B+ blog grade any day.  Thanks!
     
    You're correct that the story could have been written a while ago.  In fact, with the change of a few nouns, it could have been written by an engineer moving from vacuum tubes to transistors, then by another engineer moving from transistors to op amp modules and then yet another engineer moving from op amp modules to op amp ICs.
     
    I actually started the blog from another perspective.  The original perspective was from that of a passionate student living in a poor country with no money and only a cheap computer and free or cracked software.  They have no way to learn electronics by buying expensive components and test equipment.  They only have a cheap computer and time.  Lots of time.
     
    Contrast that against those “lucky” to live in a wealthy country.  They have the ability to continue with the expensive, limited methods which, in the end, enables the ones without a choice to move ahead by default.  The wealthy are crippled by their ability to stay put in the comfort zone.  And there they stay until the wealth moves from the once comfortable to the newly comfortable who had no choice.
     
    Expertise is a funny thing.  It has lots of value up to a point.  And then the tables are turned and the value starts to drop.  Think about the situation you describe.  You, like many others, developed great analog expertise despite all of the hurdles placed in front of you by the analog IC manufacturers.  But the world that was unable to navigate those hurdles for money reasons simply found another cheaper, higher performing way around the problem.  That's always going to be the case.
     
    Thanks for reading….Scott
  5. kendallcp
    June 11, 2013

    Interesting – there are elements of “The Innovator's Dilemma” in the scenario you paint.  Clayton Christensen posited that incumbents in a field underestimate the impact/value/significance of a new paradigm until it's too late and they are toast.  In that context, we can see the various forms of higher integration – whether custom ICs such as you design, or reconfigurable devices such as I now work with, or good-enough-in-a-box ASSPs, as the nail in old-fashioned analog's coffin.  Which is sad, for someone of my age and influences.

  6. Scott Elder
    June 11, 2013

    @Derek –

    Did you know that there are more components in a macro model than in the actual transistor circuit?

    Macro models were introduced when SPICE was good for simulating 20 transistors.  Macro models are outdated.  And there is no motivation to change that.  It would simply be a gimmick to keep from giving out the full netlist and models.

    A SPICE model is thousands of equations for one component.  A macro model is hundreds of single-equation components (e.g., resistors, capacitors, V-to-I sources, etc.).

    Think about a macro model for a two transistor CMOS inverter.  The transistor netlist would be two transistors long.  Care to guess how many components would be in a macro model?
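
    For reference, the transistor-level version really is just the two device lines.  Here is a sketch with generic model cards assumed, just to make the deck runnable:

        * CMOS inverter: the whole transistor netlist is MP and MN
        VDD vdd 0 1.8
        VIN in  0 PULSE(0 1.8 0 1n 1n 10n 20n)
        MP  out in vdd vdd pch W=2u L=0.18u
        MN  out in 0   0   nch W=1u L=0.18u
        .model nch nmos (level=54 vth0=0.4)    ; generic BSIM4 cards,
        .model pch pmos (level=54 vth0=-0.4)   ; defaults everywhere
        .tran 0.1n 40n
        .end

    A behavioral macro equivalent needs controlled switches, delay elements, input capacitance, and output resistance just to fake what those two lines do natively.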

    There was a SPICE simulator company a while back that used GPUs to speed up SPICE.  The trick was to compute the models fast.  But model computation only represents 75% of the problem, so even with infinitely fast model evaluation the remaining 25% caps the speedup at 4:1 (Amdahl's law).  With the introduction of multi-core processors, GPUs are not necessary.  So they are no longer around.

    The fastest/most accurate simulations are for the smallest unique node-count circuits.


  7. Brad Albing
    June 11, 2013

    KCP – re [6/11/2013 11:50:02 AM] well said – thanks for a highly descriptive response.

  8. Netcrawl
    June 12, 2013

    Great article, thanks for sharing it. Yes, I agree with you @kendall – well-established companies in the technology market recognize the need for continual technology development, yet most of these companies remain vulnerable to unforeseen market transitions or shifts. They “lack sustainment”; I believe a company should treat sustained innovation as a continuous evolution of an existing technology to remain competitive and stay alive in the market. It's a jungle out there!


  9. Brad Albing
    June 12, 2013

    @KCP >> …incumbents in a field underestimate the impact/value/significance of a new paradigm until it's too late and they are toast…. Which is sad, for someone of my age and influences. << Geeze – you're making me feel like I should just retire now….

  10. Brad Albing
    June 12, 2013

    @Scott – Good material here for a blog – tho' you posted lots here already – but maybe you can expand on this.

  11. bjcoppa
    June 13, 2013

    The rise of fabless semiconductor companies has been quite interesting to follow over the last decade. These companies tend to focus more on design, often in partnership with dedicated design firms such as ARM, which are leading the mobile-device IC world. The fabless semiconductor model has been shown to be very profitable when well executed, as in the case of Broadcom and Qualcomm. Top Taiwanese foundries have gained as a result and become more comprehensive and advanced in their analog and digital IC portfolios, along with enhanced design capabilities.

  12. Work to Ride comma Ride to Work
    June 19, 2013

    No need to retire.  We need more people with good historical perspective (a nice way of saying old) to keep the newer folks from going astray. 

    I can hear Bob Pease rolling over in his grave right now at the prospect that we need to spend more time making accurate simulations.  His best simulator was a soldering iron.  I look at simulations as a first-order solution, followed by spinning a few boards and quality time in the lab to vector in on the best solution.  Much cheaper and faster than screwing around with simulation models for weeks on end.

    This process of designing analog components using piece parts will never go away completely.  There are still too many design requirements that cannot be solved with a catalog.  Certainly, within commodity markets, we will see more off-the-shelf products, ready-to-go.  But there are still many designs going on outside of the consumer world that are not going to be satisfied with one chunk of silicon.

  13. JeffL_#2
    June 19, 2013

    Scott, I respect your view but I don't think someone who designs primarily on SoCs has the same viewpoint as someone who works at the PCB level. Let me elaborate a little. I have an older integrated PCB package that includes SPICE with macromodels. This was an inexpensive package but the benefit to me is that all its libraries are “open”. The company was sold and no longer supports the product, but my primary issue is that there are virtually NO PCB symbols for the devices in the packages one finds in vogue today (when they are actually present, they're not “associated with” the device, so it's necessary to rummage around a couple dozen libraries to find them). The first observation is there isn't any commonality in the economic model between the companies that write PCB SW that allows open libraries and the ones that close them. If I had to purchase a package that even had relatively up-to-date footprints I would wind up “licensing a seat” of a closed-library program at thousands a month (like CAD) and I would still wind up being constrained to design with six-month-old parts. I can't afford THAT program; how could I afford one with up-to-date and accurate SPICE models? Besides, it's seldom at the PCB level that I have to push the very last ounce of performance out of the analog portion of the design; at that level I can afford to “overspec” a little, and that may not be practical on an SoC. As you mentioned, the vendors don't provide these models either (occasionally you see something linked on a distribution website, but seldom do you even know what chip rev it's for). It's also the case that you're committing a lot more capital to run several million SoCs than a few hundred PCBs like most of us see.

    I also have a disagreement about your statement that went something like “we have multicore so we don't need the GPUs”. The parallelization of code on standard CPUs with multithreads has a very poor success rate and is a horribly manual, trial-and-error process, for as soon as a single cache element is changed the entire cache needs to be refreshed, and most platforms don't even have a stable, reliable method for determining whether a given application is threadsafe. My understanding is that in GPU architectures like the Tesla approach the application performance may improve somewhat with manual tuning, but at least the rough equivalent of “threadsafe” operation is guaranteed. In non-GPU architectures (x86) the performance increase of adding even several additional cores may not even reach 100% total; some of that may be because the best the supporting OSes normally do is just SMT (simultaneous multithreading). I know the leading purveyor of these processors has become primarily a marketing company, but I'm afraid the benefits of this whole approach have been way oversold. A reasonably optimized GPU multicore parallel application still ought to be expected to beat a CPU implementation 7 days out of 7; of course what you get in the “real world” may not even come close, but I believe the analysis will hold. At least, if you know why it doesn't, I'd like to hear why.

  14. Hughston
    June 19, 2013

    For me, the generic macromodels seem to be good enough because I am just checking the gain and frequency response of a signal chain most of the time and I am not doing very high frequencies. If my answer is within 2%, that's fine. But I am using LTspice to do my simulations, and they have generic models for their discretes that are nowhere close to what the real model should be. I think people may not be aware of that and are trusting the models to give them accuracy that is not there. The capacitor models and the transistor models are not accurate. I will trust their resistor models because I just want a plain old resistor.  For the capacitors I use the manufacturer's SPICE models if it matters for the design. Transistor models should be put in a model statement.
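
    For anyone who hasn't looked inside one, the difference is roughly the sketch below. The values are assumptions for a generic 10 uF MLCC; real manufacturer models add more stages:

        * A generic library part is just "C1 a b 10u"; a manufacturer
        * model adds the parasitics that set the impedance you actually get:
        .subckt CAP10U_MLCC 1 2
        Cm   1 3 10u
        Resr 3 4 5m     ; ESR
        Lesl 4 2 1n     ; ESL: self-resonant near 1.6 MHz
        .ends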

    You guys might need something better than the macromodel since you do high-end design work. For people that use these generic versions of SPICE, if they want the better model for a part they'd better make it or look it up. Or maybe, in the case of TI parts, switch to their version of SPICE. I would assume you get better results for their parts, but I haven't verified that.

    Isn't it like we always say: SPICE simulation is great when you know what the answer should be already.

  15. BradWood
    June 19, 2013

    It's a bit ironic to see Bob Pease mentioned.  Of course he hated computers and simulators, although in time was forced to use them.  Quite a while ago, however, he expressed the notion that design of anything other than ICs was obsolete.  As my designs used a mix of discrete and integrated components to achieve optimal performance/cost ratio products, which in some cases shipped in the multiple millions, I found Pease's remarks particularly lacking.

    But although the tools for IC design are astonishing and praiseworthy, the notion that board-level design will be abandoned is at best quite premature.  Now for certain applications, one will need to do integrated just to achieve the performance and size reductions attendant on mobile/wireless products, and the volumes will be there to justify the costly development.  But that's a particular area, and by no means the only activity, although those working in the field might well imagine that not much else is going on.

    At the moment I am working on a piece of test equipment for loudspeaker development.  The “client” anticipates selling perhaps a hundred of them.  I'm using a mix of discrete components and integrated circuits, including analog multipliers, rms converters, and a mix of high-performance and commodity opamps.  The good-sized PCB is mostly SMD but includes leaded parts where SM is unavailable or cumbersome.  I have no doubt that the performance will meet specification, based on experience and simulations.  But if the chore were to do things at RF I'd be a lot less confident.  However, the anticipated volumes simply do not support anything by way of custom integrated circuits.

    Another design completed a few years ago was a gauge for determining paint/filler thickness.  Again the volumes were modest and a mix of discrete and integrated components achieved adequate performance and very low cost.  The liaison company with the offshore manufacturer was disappointed that the device didn't have more “digital” content, despite the fact that throwing a uC down would not have added value and would have increased emissions.


    Brad



  17. Brooks Lyman
    June 19, 2013

    As Brad Wood points out, it's really a matter of how many units you are going to build.  I would also add, how critical the design is.  If one is working at high frequency or ultra-low-power or low-noise/very small signals, then there might be a real need for simulation. 

    But a lot of analog design is rather simple and undemanding, and one can often sketch the thing out on paper or in CAD, do a few calculations and build a prototype and have it work first time.  This is also true of small-scale low-speed digital design using discrete IC logic chips. 

    Microprocessors/microcomputers/DSPs certainly have their places, and there are things that are almost impossible to do without them, but for the simple stuff, the coding costs can exceed the development costs of a simple analog board.

    I do a lot of industrial control design.  Today, most of that is done with a PLC, but sometimes a few in-amps/op-amps/comparators and a relay or two, all powered by a small analog power supply with a linear regulator is the simpler way to go.  I've designed dozens of these sorts of circuits, most of which worked first time around (which is important when your “production run” is half a dozen boards).

    So, if you are looking at tens or hundreds of thousands (or more, enjoy!) of units, then going to a custom analog or mixed-signal IC can make sense.  But as some posts point out, there's a lot more to the analog electronics market than these huge, generally ultra-demanding designs.

    Indeed, one problem for the small-quantity board designer is that if one needs to use SM components, the assembly becomes difficult in-house or expensive farmed out.  There's a lot to be said for DIPs and thru-hole components!  Unfortunately, the IC and component manufacturers seem to be chasing the large-quantity, miniaturized market.  This is understandable, but not enjoyable….

  18. Scott Elder
    June 19, 2013

    Thanks for reading and commenting.  It's always nice to know in the cyber world that I am actually engaging with a human!  And I love dissenting opinions.  SMILE.

    < >

    The problem is that any claim that one generation is ending and another is beginning will always be met with skepticism.  I understand that.  That's okay.  People still design with vacuum tubes, for heaven's sake.  Probably for good reason too.  So my claims about ending have to do with the 3-sigma segment of the analog industry and those just now coming into the industry who will be around when 3 sigma becomes >4 sigma.

    The problem I have rendering a decision that a design is solid based upon looking at a “few boards” is that you have absolutely no way to know what is on those few boards.

    Did all of your vendors send you nominal parts?  Did you get grade-A data converters because the manufacturer didn't have enough of the lower grade-B that your distributor ordered?  You will never know, because the manufacturer just stamped the plastic with a B; an A-grade part is also a B-grade part, but not the other way around.  So your decision is based upon flawed results.  Flying blind.

    I'm not saying designing the way you described can't be done.  Obviously this methodology has been around for a long time–and as I wrote, it was out of necessity.  I'm simply pointing out that the methodology has a big weakness from a quality perspective.  And it is also not how the parts that you use on your designs were designed.  They were designed with a lot more rigorous attention to quality.


  19. Scott Elder
    June 19, 2013

    @JeffL_2

    Thanks for commenting!

    I believe we are agreeing with one another, but maybe I'm missing your point.

    You are basically writing that you have real and practical constraints placed upon you that limit your ability to proof a design before you buy any parts.  And that was my point.  The PCB world has been left behind because pretty much everyone in that domain is suffering from the same practical constraints.

    But working on a project with a budget that can't support a $100,000 tool doesn't give one a pass on quality, right?  It simply means that additional risks need to be taken because the budget is what it is.

    I've been in many tense meetings where, if my response to the customer was “hey…I looked at a few parts on the bench and everything was fine”, I'd be asked to leave the building.  The expectation is that full rigorous analysis has been run on a design.  And the data is there to back up my claims.

    When I first started in analog, things weren't that rigorous.  There weren't ISO 9000-type organizations or expectations.  Now that's standard.  So the question is, how does one working at a different level in analog design satisfy rigorous QA standards?  Is there a different standard of acceptable risk simply because one only needs 500 boards of a design, not 50,000?

    It seems to me that the solution at small volumes is to move up the food chain to the next higher-level assembly that is sold in larger quantities and has consequently undergone a more rigorous analysis.

  20. Scott Elder
    June 19, 2013

    Hi Hughston!

    < >

    I always tell the less experienced engineers that SPICE only answers the questions you ask.

    Perhaps the PCB designers should organize a push towards manufacturers giving access to the real circuits, not models.

    This could all be done if they wanted to.  You run SPICE remotely on their server and never get access to the models and netlists–only the simulation results.  And they should just provide the SPICE engine for free.  It seems like this would help drive more business, because obviously customers would only be able to simulate designs that used their parts exclusively.

    I think the big guys have maybe 70,000 customers total.  So how many computers does one really need to support a group that small?  It's not like everyone will sit on a computer 24/7 for months on end.  Only we IC guys do that….


  21. Scott Elder
    June 19, 2013

    Hi Brad –

    For the two examples you cited, is there nothing else available at a much higher level of integration that you could use?

    I'm always curious about what was missed after 50 years of analog IC engineering.

    I recognize that maybe it might be a lot more money, and maybe half the IC would go unused, but at a small volume does that really matter?

    I own about five GFLOP-class Intel computers in the house and only one does anything that needs that much horsepower.  But at $200 a processor, who cares?  Why doesn't this apply in analog?

    Thanks….Scott


  22. JeffL_#2
    June 19, 2013

    Scott,

    Thanks for reading my reply! I'm just as sure you didn't mean to imply I “don't design for quality”, in fact the situation is I now do software certification to DO-178C and often to Level A which is probably the highest level of software quality in common commercial use. The fact is I don't really support myself designing analog circuits anymore. WAY back when I did, simulation was unaffordable and the goal of a “conservative” design was one which would work whenever parts matching the numbers listed on the formal BOM were built into the board.

    If I had designed a circuit that specified “you have to use only rev D of that chip, it's the only one I designed the circuit to work with” I'D have been fired! The same would have happened (if I COULD have afforded to simulate back then) if I'd asked for sufficient time and resources to simulate “chip 1 revs A through E against chip 2 revs B through F against…”, and THAT would be further removed from possibility by the unlikelihood that I could obtain and identify the correct models for revs A through E etcetera, and if the vendor discontinues everything but a new rev F do I start over again?

    Do you see what I mean here? I'm not saying either one of us is right and the other is wrong. I remember when logic designers used to come to me and proudly announce they'd done a “worst case analysis” on their design. When they showed the design to me I'd ask “I understand where you got the maximum prop delay for that gate, where did you get the minimum?” and they'd reply with something like “oh it's not on the data sheet, but EVERYONE KNOWS it's one-fifth the max” or something.

    During my analog days there's ALWAYS been “inscrutable” parameters that a vendor isn't willing to put on the data sheet, I guess maybe it was a simpler time because we never felt we could get ALL the parameters spec'd (or even all the critical ones, that's why there was a category, however derided, called “select in test”, we'd try and minimize it but it was generally futile to try and eliminate it).

    Nowadays I only get to do PCBs when there's no existing equivalent for form, fit, function available, I have to admit if there were a couple dozen other vendors with exactly compatible designs I'd have a totally different tune. But either way you can do a thorough and practical design, I'm saying they're just from different standpoints, see?

    Jeff

  23. Etmax
    June 20, 2013

    This sounds like it was written by the same type of person that would be in the job of designing and building supercomputers and would then say that the microcontroller is dead. No one design methodology can ever be all things to all people. There are DSP people out there who think analog is dead. These naysayers are all just simply too unimaginative to see anything outside their area of expertise. The chip industry is a multi-billion-dollar industry precisely because there are so many niches needing fillers. By the “Designing With Op-Amps Is Ending” argument, the rise of humanity would mean that there are no longer any cockroaches and ants 🙂 well, we know where that argument's gone.

  24. Navelpluis
    June 20, 2013

    Thanks Scott for your very interesting article. It gave me double thoughts. 

    I was designing reference designs for one of the major FPGA companies in 2003/2004. Before I was kicked out as a supplier they went to India to try to design them there, cheaper of course. What I found out at that time is that good basic knowledge of analog and RF design is a *must* when designing PCBs to run at the speeds of that time (approx. 2 GHz on FR4, wide, wide 64-bit buses!). You can expect the outcome of the India 'experiment': it was a total failure. A pity for our little company: US management ordered 'nothing critical goes out of the US' and so we lost this FPGA vendor as our customer.

    But the funny part is that after 5 years I heard that a big US contractor got one of my reference designs. They tried to build a board with the same FPGA and structures as proven in our design. The FPGA vendor had lots and lots of trouble getting the design running, even though they had 'board simulation tools' from, for example, Mentor Graphics. And this was the failure: I don't believe in 'board verification'; I only solve 'local' problems. The pitfalls were the FPGA footprints with the high speed and how to break out the signals from them. It took them >6 iterations of a 12-layer board to get it running. But eventually, with a bit of help (via us), they got it running. Very successful product, telecoms, high volumes.

    Scott, I am talking about PCB design, please don't forget: I think there is a huge demand for all-round analog designers to solve these kinds of problems today. Op-amp design for sensor circuitry is one of the areas we still see a lot, not to speak of discrete RF designs, things we really love to do here. Lots of engineers find this 'black magic', and an inductor is something 'you better not use' 😉


  25. Brad Albing
    June 20, 2013

    @BradWood – thanks – some well thought out and nicely stated comments here.

  26. Brad Albing
    June 20, 2013

    @Scott – seems plausible at first – tho' I'm not sure what would be good examples of such analog parts. Seems like some devices made for high levels of audio, video, or RF processing in home entertainment equipment might be usable for some other application. Maybe someone else can cite examples.

  27. Scott Elder
    June 20, 2013

    Hi Navelpluis-

    Thanks for taking time to share your thoughts.

    As you point out, to solve any problem one needs great tools and great engineers.  I've just been a bit surprised about how the design requirements for quality in analog haven't translated to the board level yet.  For some reason it is still okay to release a board-level analog system to production having seen only a handful of boards using a plethora of different ICs from different manufacturers and nothing studied in detail.  Try that level of qualification with an IC and those same board-level designers will cut you off immediately.

    TI and Analog Devices share the same 70,000 or so worldwide customers.  And they both sell about 2000 different op amp part numbers.  If I added in every other analog IC company in the world, the number of op amp part numbers would more than double, but the customer count wouldn't change much.  Does our industry really need that much diversity in op amp selection?

    Somehow the entire world is able to solve the majority of their computing needs with a design from one company–Intel.  But analog….we need 4000 different little 8 pin amplifiers hooked up to all sorts of other little 8 pin parts.

    Analog is said to be art as much as science.  Perhaps that's the problem.  Artists always want to create something unique.  But then artists have traditionally always been poor also.


  28. Brad Albing
    June 20, 2013

    Sad to think that there are engineers who are intimidated by reactive components. Guess they are scared to wander away from the real axis. “Keep that 'j-omega' stuff away from me!”

  29. Navelpluis
    June 20, 2013

    Hi Scott,

    I agree with you that the enormous number of parts from the different vendors indeed is silly. With the merger of National Semiconductor into TI I expect a shake-out of lots of old parts quite soon, and still, plenty are around to choose from. Too many, indeed.

    But please mind the following: try to design a good chopper amplifier for the sub-Hz frequency range, 140 dB dynamic range, extremely low noise. Most engineers think there must be a component around that does this, but hey, one might get disappointed. Doing this on a chip is not trivial due to capacitor size limitations.

    Let us indeed call it artwork to design this 'problem' by using discrete components. Indeed it feels like a kind of art. Apart from choosing the right components, the board layout plays a *big* role in the performance of our 'problem'. What I want to say with all of this is that the role of proper PCB design often is very underestimated. 

    I might say that this knowledge is one of our money makers. Indeed, we do not simulate all, due to lack of good models and sometimes lack of tools and… don't forget, lack of time. Accepting work from a customer might be a risk due to these facts. But we learned to say *no* if some work comes along that goes 'over our heads'.

    Anyway, I agree that over time a pure analog designer will lose 'ground' in the 'field of engineering'. And I am one of them. Good to think about this.

    Scott, so you are right in your opinion, but until I am old I probably will solve lots of interesting stuff, hate Windoze + buggy CAD tools, love our physical circuit testing + measuring equipment, and love driving around in my 10-year-old '911' in my scarce spare hours 😉

    Any good tips on how people like me can move into analog chip design? I guess analog minds around here will be very, very interested, so, in my opinion, any good article + tips about analog chip design is most welcome.

  30. Scott Elder
    June 20, 2013

    < >

    @Navelpluis – I've got a couple of blogs on here dealing with how to learn integrated analog IC design.  http://www.planetanalog.com/author.asp?section_id=526&doc_id=559519

    Look me up on LinkedIn (http://www.linkedin.com/pub/j-scott-elder/48/217/717) and ask away.

    Thanks for taking the time to kick this around.

    Scott


  31. Scott Elder
    June 20, 2013

    Hi Jeff —

    <<I'm just as sure you didn't mean to imply I “don't design for quality”>>

    Are you kidding?  My face is plastered all over this site and I still like to walk in public. 🙂

    But you're right about this:

    “I have to admit if there were a couple dozen other vendors with exactly compatible designs I'd have a totally different tune.”

    We both agree here for sure…even in the integrated domain.  I wrote about this a couple weeks back.

    The world is too big to think that the majority of people are all working on unique one-off analog designs never before seen.  It's really about whether something is good enough, and then how important full optimization around price/performance is when one is talking about 100-1000 boards.

    There's always going to be that >3 sigma use case, but I doubt the analog semiconductor industry will be able to support many of those as the volumes/prices for discrete parts continue to drop and the volume for high level system parts continues to rise.

    It's just a natural maturation of the industry.  The same thing happened to all the corner custom PC shops in town.  I wonder how much money vaporized in PCB discrete analog when the world decided they'd buy their future PC products only from Apple or Samsung in a tablet or phone format?

    Thanks for reading, Jeff.

  32. WKetel
    June 21, 2013

    I have some news for you, which is that people have been producing good designs built on circuit boards for a long time. In fact, for quite a while before circuit board layout programs were generally available. And then they started using transistors and IC op-amps, and comparators, and all manner of chips got involved: op-amps and CMOS logic and other stuff as well.

    Of course the latest and greatest expensive design software does make it easier and simpler, and assures more people of a first-time success, so there is an improvement factor there. But some of us have been doing it for a long time, doing the math with a spreadsheet or on paper, or with a calculator.

    My point being that just because you don't have the tools that let the brainless do a design is not a reason that the design can't be done, at least as well, possibly better. “The results from using a simulation model are seldom any better than that model” (Bob Pease said that first), and it is still true today. Just because a task needs some effort and brain work is no indication that it can't be done, and done right.

    Besides all of that, would it EVER make sense to create a custom ASIC for a production run of a dozen boards? Or even a hundred?

  33. Scott Elder
    June 21, 2013

    < >

    I'm sorry, but that statement is just simply wrong.

    < >

    No, it usually doesn't make sense.  So the future will be either a custom IC where it makes sense or a higher-level IC where a few bucks more are spent to get the job done quickly with a reliable solution used by many.  Move the analog problem as quickly as possible into the digital domain and then solve the rest of the problem with software.

    I appreciate that you might not like that, but that's where things are headed.  The economics of designing a 10-board circuit where $100,000 engineering talent sits around thinking about 0.5-cent resistors and where to get them, when to order them, where to stock them, etc. isn't a sustainable business model for much longer.


  34. WKetel
    June 21, 2013

    Some models are better than others, but that old statement remains true: “garbage in, garbage out”. And you might explain how simulation results can be more accurate than the models used for that simulation. To get accurate results it is usually a requirement that both the model and the data be accurate. And not every supplier provides the most accurate models for all of their linear devices; some models are “simplified a bit” and while they work for some conditions they may not be accurate for all conditions.

    A complete model for some of the analog ICs that I have seen circuits for would be quite complex, and the model would be “quite large”, and the simulation of a circuit that had a lot of components and used that model might not converge for quite a while, even more so if the circuit had multiple op-amps.

    I believe that was the point that Bob Pease was making. And since he was a very good analog engineer, a wizard, in fact, I don't challenge his judgment on that point. 

  35. Scott Elder
    June 22, 2013

    @WKetel

    I think you and I are discussing two different topics.  The models used by an analog IC design company are nearly perfect.  The models provided by those companies to the users of their parts are far from perfect and in most cases not very useful.  And that was the point of the blog.

    An op amp is 10-100 transistors.  That takes less than 60 seconds to fully simulate using the tools of a semiconductor company and the models are dead on exact.  The entire startup and settled regulation of a switched mode power supply IC using the perfect models might take a few minutes up to one hour.  And again be dead on accurate.

    An analog IC power supply designer can spool up about 200 Monte Carlo runs of their full transistor-level design overnight using the corporate computer farm network and look at the histogram distribution of performance the next morning, before parts even arrive from Digi-Key to build the board. [Line reg, load reg, startup time/transient, load transient response, thermal response, all inductor corner cases, ESR, ESL… an infinite list of things that can be studied.]

    The playing field is not level.  That's my point.  And that's why I believe the analog industry is at a turning point where PCB discrete analog becomes less and less viable.  Both from the ability to thoroughly study a design before it is committed to production as well as just running the application simulations prior to prototyping.

  36. WKetel
    June 23, 2013

    There is a great deal more to the realm of analog designs than switchmode power supplies. 

  37. Brad Albing
    June 23, 2013

    Well, certainly. Can you expand upon that a bit in this context?

  38. WKetel
    June 24, 2013

    Aside from switchmode power supplies and chargers for the various lithium battery types, we have motor and solenoid drivers and all manner of audio amplifiers. Also a wide realm of RF systems, including power amplifiers and very-low-noise receiver systems. Then there are those amplifiers and controls associated with instrumentation that must be DC-accurate as well as having perfectly linear response up to frequencies in the kHz range. All of this aside from the realm of “consumer products”. These are all in the industrial arena, not produced in quantities large enough to merit an ASIC. Some are not built in quantities large enough to merit a circuit board design.

  39. audiocal
    June 25, 2013

    I guess it depends on how far you are pushing the op amp parameters. We always design a minimum of one decade of gain and/or bandwidth away from the GBW limit, usually more since we don't have really high frequency requirements. For these cases the SPICE macromodels work fine to show any gross design errors. We make a well-educated guess as to the parasitic capacitance across the op amp inputs and cancel the phase margin reduction from that with a feedback cap across the feedback resistor. This conservative design philosophy works well for both SMD and thru-hole PCB layouts.
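
    In deck form, the trick looks roughly like the sketch below. It uses a generic behavioral one-pole op-amp, and the parasitic value is an assumption, exaggerated so the effect shows up clearly in the AC sweep:

        * Inverting amp: Cin erodes phase margin, Cf cancels it
        VIN sig 0 AC 1
        Rg  sig inn 10k
        Rf  inn out 100k
        Cin inn 0   100p      ; assumed summing-node parasitic
        Cf  inn out 10p       ; zero at 1/(2*pi*Rf*Cf), ~159 kHz
        * behavioral one-pole op-amp, ~1 MHz GBW, +input grounded
        Gop 0 n1 0 inn 1m     ; input transconductance
        Rop n1 0 100meg       ; 100 dB DC gain
        Cop n1 0 159p         ; dominant pole ~10 Hz
        Eop out 0 n1 0 1      ; ideal output buffer
        .ac dec 50 10 10meg
        .end

    Delete Cf and re-run to watch the peaking come back.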

  40. Brad Albing
    June 25, 2013

    OK – this helps clarify. We should discuss further in the context of blogging.

  41. Scott Elder
    June 25, 2013

    < >

    That's certainly one aspect.  But then what about things like rejection of power supply noise into the signal path?  Stability if the load is reactive?  Settling time driving an ADC input?

    The list can be pretty long.


  42. WKetel
    June 25, 2013

    It seems that I forgot to even mention the designing of printed circuit boards, which is where those opamps and other things live. I have gone through one company circuit board rule book, and my conclusion is that if the board designers know nothing about the circuit being laid out, then probably that 250-page document is appropriate; but if the same person who designed the circuit is doing the layout, it may work well with only a minimum of rules. Understanding the circuit probably trumps any number of rules, since exceptions almost always arise, and the layout drafter probably does not have a clue. I know that was the situation when I worked at Methode. One artwork person understood electronics well enough to know to ask for explanations; the other person did not have a clue and was convinced that the rules covered everything. They sort of did, except for the certainty of first-spin success.

  43. Brad Albing
    June 26, 2013

    @Scott – and you're just scratching the surface there with complicating factors.

  44. Brad Albing
    June 26, 2013

    @WKetel – Which is why the design engineer should be looking over the shoulder of the layout guy. Tedious – maybe even annoying – but it leads to higher quality results/fewer respins.

  45. WKetel
    June 26, 2013

    It was quite a challenge the most recent time, when the PCB layout designer really did not know the difference between op-amps and apple dumplings. At that point it would have made more sense for the engineer to do the layout and then have the PCB expert check for rule violations. But that did not happen; instead we just kept correcting the problems. Tedious but effective.

  46. PCR
    June 30, 2013

    Very true, Netcrawl. It is compulsory that companies always look at the continuous development of technology in order to survive in the very competitive market.

  47. David Maciel Silva
    June 30, 2013

    Several times we had to stop and analyze the layout, until I began to trust the person in charge of the layout more; then I taught him how to analyze the data sheets and how things should be done, mainly ground planes.
