Although analog circuit design techniques continue to advance rapidly, and companies can still gain a competitive edge through artful circuit design, most observers anticipate a future of incremental gains in pure analog circuit design for individual functions.
Moreover, commercial pressure to achieve high analog performance using low-cost process technology, such as deep-submicron CMOS, makes the job even harder: the trend toward deeper-submicron processes brings lower-voltage supply rails, lower intrinsic gain in active devices, worsening Vt mismatch in MOS transistors, and poorer-quality passive structures.
While it is true that we don't always have to design our analog circuits in processes optimized for high-density digital designs, commercial pressure to deliver more analog performance with fewer and lower-cost IC process options is universal. As a result, companies that do deliver higher performance with less-exotic technologies are most likely to have the competitive edge.
Over the last decade, the balance of forces has changed. Low-area component matching has become more difficult, whereas the area and power required for digital calibration circuits have decreased with Moore's Law. In the face of such trends, it should come as no surprise that classical analog design techniques are being augmented by digital calibration, adaptive signal processing, and nonlinear correction methods.
Some common examples of digital algorithms that improve delivered analog performance are:
- Dynamic element matching in ADCs
- In-phase vs. quadrature signal path gain and phase correction
- Digital predistortion (DPD)
- DC offset correction
- Digital frequency offset correction
- Frequency-dependent group delay compensation and gain droop compensation
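As a small illustration of one item in the list, I/Q gain and phase mismatch can be estimated blindly from signal statistics and removed with a Gram-Schmidt-style correction. The sketch below is illustrative only: the function name, the 1.2x gain error, and the 0.1 rad phase skew are assumptions chosen for the demo, not values from any particular chip.

```python
import math

def iq_correct(i_samples, q_samples):
    """Blind Gram-Schmidt I/Q imbalance correction (illustrative sketch).

    Estimates phase skew from the I/Q cross-correlation and gain error
    from the branch powers, then orthogonalizes and rescales Q.
    Assumes zero-mean signals with equal nominal branch power.
    """
    n = len(i_samples)
    p_i = sum(x * x for x in i_samples) / n                       # I-branch power
    c_iq = sum(x * y for x, y in zip(i_samples, q_samples)) / n   # cross-correlation
    rho = c_iq / p_i                                              # phase-skew estimate
    q1 = [y - rho * x for x, y in zip(i_samples, q_samples)]      # remove skew
    p_q1 = sum(x * x for x in q1) / n
    g = math.sqrt(p_i / p_q1)                                     # equalize gain
    return i_samples, [g * x for x in q1]

# Demo: a tone whose Q branch has a 1.2x gain error and 0.1 rad phase skew.
N = 4096
t = [2 * math.pi * 7 * k / N for k in range(N)]
i_rx = [math.cos(x) for x in t]
q_rx = [1.2 * math.sin(x + 0.1) for x in t]
i_c, q_c = iq_correct(i_rx, q_rx)
```

Because the correction uses only averaged statistics of the received signal itself, the same estimator can run continuously in the background on mission-mode traffic.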
These algorithms should ideally be capable of continuous adaptation using mission-mode signals (also known as background calibration). In some cases, foreground calibration using special-purpose signals may be needed, but even this should be autonomous and not require factory test time. A nonintrusive example of foreground calibration could be initiated at power-on reset and managed by an on-chip state machine. The least desirable solutions involve factory-test calibrations that add to test time and cannot adapt to changing voltage and temperature conditions in the field.
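The background-calibration idea can be sketched in a few lines: a first-order loop that tracks and removes DC offset using only the mission-mode samples flowing through it, with no special calibration signal. The class name and the loop-gain value `mu` are illustrative assumptions, not from any specific design.

```python
import math

class DcOffsetCanceller:
    """Background DC offset correction: a leaky integrator tracks the
    running mean of mission-mode samples and subtracts it, so the loop
    adapts continuously as voltage and temperature drift."""

    def __init__(self, mu=1 / 256):
        self.mu = mu         # loop gain: smaller => slower, lower-noise tracking
        self.offset = 0.0    # current offset estimate

    def process(self, x):
        y = x - self.offset            # corrected output sample
        self.offset += self.mu * y     # first-order (LMS-like) adaptation
        return y

# Demo: mission-mode signal (a tone) riding on a 0.3 V offset.
canc = DcOffsetCanceller(mu=1 / 256)
samples = [0.3 + math.sin(2 * math.pi * k / 8) for k in range(10000)]
out = [canc.process(x) for x in samples]
```

The trade-off in choosing `mu` mirrors the article's point about field conditions: a small loop gain tracks slow voltage and temperature drift with little added noise, while a larger one converges faster after power-on reset.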
One difficulty in creating solutions where digital and analog circuits are highly interdependent is that engineers are often specialized in one domain or the other, and will seek solutions only inside their specialization. The old adage applies: if you have a hammer, everything looks like a nail! To overcome this, the best development organizations employ cross-functional teams with good communication skills and (ideally) a few key individuals who are strong in both domains. Equally important is an open culture in which engineering managers facilitate cross-domain collaboration and don't favor one engineering discipline (theirs!) over the others.
What do you think? Does analog circuit design do best when it keeps itself pure, or are the best modern analog designs to be had by analog and digital making friends through the judicious use of digital signal processing techniques?