This is an industry of never-ending change, and analog is not immune to that wave. While looking over some product announcements recently, as well as a stack of design articles (yes, the paper versions are great for browsing and grazing), I got a stronger sense of how the analog-centric design focus has shifted in the past few years.
How so? As recently as ten or even five years ago, the emphasis of many analog vendors and their parts was on improving the signal/data acquisition channel, beginning with better building blocks, such as op-amps and converters. Among the critical parameters were input offset, bias current, drift, linearity, stability, resolution, bandwidth/speed, and noise; you can add your own parameters to the list here, of course. Other technical stories were often about proper application of these parts to maintain their raw performance, as there's little point in having a great part if your layout, power rails, or grounding scheme negates its virtues. (For my take and reader comments on the dangers of using that term "ground," see (Under)standing Your Ground.)
Where's the emphasis now? First, it's on power consumption in various modes -- active, standby, quiescent -- and efficiency. Second, it's on broader issues more than individual parameters: signal integrity, system noise, EMI/RFI, regulatory concerns, compatibility with (and conformance to) industry standards, drive capability, slew rates, spectrum use and management, filtering, and ESD and other protection.
Why is this so? Here's my hypothesis: The basic analog parts for the signal channel have gotten so good that, for most applications, they are no longer the gating item or bottleneck in performance. This is a result of enhancements in process technology, component design, packaging, test, and trim/calibration at the component level, plus the use of on-board memory and system processors to implement correction techniques that can, in many cases, overcome a part's limitations. For example, less-than-perfect transducers, with unavoidable non-linearities and drift, can be individually calibrated at the factory, with the corrections stored in the transducer, the probe, or the instrument itself.
Of course, there are leading/bleeding-edge applications that need that next increment (small or large) of performance, such as converters for software-defined radio (SDR), mil/aero projects, or whatever is going on at the Large Hadron Collider at CERN. However, the bulk of the applications out there today are served pretty well by what's available. Parts have gotten better in so many ways, and at lower power and cost, that the incremental gains realized by better performance are quite modest in the big picture.
The designer's challenge now is getting the overall system to meet design objectives. That requires very good components, for sure, but also components working together so that the aggregate performance, power, and price targets are met, and on time. Whether this is done with individual, discrete-function ICs (such as op-amps, in-amps, A/D and D/A converters, and voltage references), more integrated SoCs, or highly integrated, application-targeted ICs -- those are the alternatives a designer must consider and weigh.
What's been your experience with the change in analog focus over the years? Is it a good thing, a bad thing, neither, or maybe both? Has it limited your options or has it broadened them? Has it made your project easier, harder, or just different?