Enabling Analog Integration

I’ve seen quite a few comments in this section along the lines of “What’s really new here?” Is there some new technological enabler that now makes it possible to integrate more analog than ever before? What’s analog’s equivalent of Moore’s law?

While I can’t wave a wand and make a clear, decisive answer appear, a number of factors should help clarify the picture. Some are technology based; others relate more to the culture and mindset of the company doing the integration. I will try to be specific about the enablers I can enumerate.

Process technology innovations
While digital CMOS technology has been shrinking by regular factors of two every two years, it should be noted that digital CMOS contains only transistors and interconnects. This means that fully digital ICs can benefit fully from reduced line widths. This is in contrast to analog circuits that need a significant amount of capacitor and resistor area. Many analog IC layouts are dominated by capacitor area and may make use of on-chip inductors as well, which are also area-intensive.

Process technologies that can provide higher capacitance per square millimeter can obviously make a big difference here, and that’s an area where you can expect innovation from companies that have their own process technology research such as IBM, NXP, and Maxim Integrated.

The basic parameters that define capacitance (relative permittivity, total area, and distance between the “plates”) are all subject to innovation in IC process technology. In particular, the use of deep trench structures, as contrasted with traditional planar techniques, can dramatically increase the effective capacitor area per planar mm².
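As a back-of-the-envelope illustration (the dielectric constant, thickness, and trench area multiplier below are assumed for illustration, not taken from any real process), the parallel-plate relation C = ε₀·εᵣ·A/d shows why multiplying effective plate area with a trench structure pays off directly in capacitance per unit of planar footprint:

```python
# Illustrative parallel-plate capacitance estimate: C = eps0 * eps_r * A / d.
# All process numbers below are assumed, not from any real technology.
EPS0 = 8.854e-12  # F/m, vacuum permittivity

def cap_per_mm2(eps_r, d_nm, area_multiplier=1.0):
    """Capacitance per mm^2 of planar footprint, in nF/mm^2.

    area_multiplier models the extra sidewall area a deep-trench
    structure provides per unit of planar footprint.
    """
    area_m2 = 1e-6 * area_multiplier       # 1 mm^2 footprint, in m^2
    d_m = d_nm * 1e-9                      # dielectric thickness, in m
    return EPS0 * eps_r * area_m2 / d_m * 1e9  # convert F to nF

planar = cap_per_mm2(eps_r=7.0, d_nm=20)                       # planar MIM cap
trench = cap_per_mm2(eps_r=7.0, d_nm=20, area_multiplier=25)   # deep trench

print(f"planar: {planar:.1f} nF/mm^2, trench: {trench:.1f} nF/mm^2")
```

With these assumed numbers the trench structure delivers 25 times the capacitance of the planar one for the same silicon footprint, which is the whole point of going vertical.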

A quick web search on this topic will reveal just how active an area of innovation this is. Another area of process innovation is the ability to handle high voltages and high logic gate densities on the same die, using bipolar, CMOS, or DMOS (BCD) technology at 180nm and below. This allows for new classes of devices such as the power SOCs that integrate a microcontroller along with efficient power-management functions, audio codecs, battery “fuel gauges,” and more.

Digitally assisted analog
Where analog circuits have inherent limitations such as non-linearity, gain variation, DC offset accumulation, and frequency response inaccuracy (to name a few), digital monitoring and control can often substantially correct these deficiencies, with complete repeatability and high precision. In this approach, the main signal path remains analog in nature, and the digital assistance is (ideally) transparent to the user. From an application point of view, the IC may look like a traditional analog function, but on the key analog performance metrics it outperforms purely analog competitors.
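As a toy sketch of the idea (the amplifier model and trim interface here are invented for illustration, not any real part’s API), a digital integrator can servo out the DC offset of an analog stage through a correction DAC:

```python
# Minimal sketch of digitally assisted analog: a digital integrator
# measures the residual DC offset of an analog stage (input grounded)
# and drives a trim DAC to null it out.

def calibrate_offset(measure_output, set_trim_dac, steps=100, gain=0.5):
    """Iteratively null the DC offset of an analog stage.

    measure_output(): returns the stage output with its input grounded.
    set_trim_dac(code): applies a correction, in the same units.
    """
    trim = 0.0
    for _ in range(steps):
        error = measure_output()   # residual offset seen at the output
        trim -= gain * error       # integrate the error into the trim
        set_trim_dac(trim)
    return trim

# Toy model of an amplifier with 12 mV of output-referred offset:
state = {"trim": 0.0}
offset = 0.012
model = lambda: offset + state["trim"]
final_trim = calibrate_offset(model, lambda c: state.__setitem__("trim", c))
print(f"residual offset: {model() * 1e3:.4f} mV")
```

The loop converges to a trim value that cancels the modeled offset; in a real part the same servo would run against comparator or ADC readings rather than a Python lambda, and the user would never see it.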

Another approach is the aggressive digitization of previously analog signal paths, which relies on the availability of ADCs and DACs with sufficient bandwidth and dynamic range within the power budget of the application. This is becoming increasingly feasible as ADC technology improves, and it should always be considered for new designs.
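The feasibility question can be framed with the standard textbook result for an ideal N-bit converter, whose quantization-limited SNR is about 6.02·N + 1.76 dB (real converters fall short of this ideal):

```python
# Rough feasibility check for digitizing an analog path, using the
# ideal quantization-limited SNR of an N-bit ADC: 6.02*N + 1.76 dB.
import math

def ideal_snr_db(bits):
    """Ideal full-scale sine-wave SNR of an N-bit converter, in dB."""
    return 6.02 * bits + 1.76

def bits_needed(snr_db):
    """Minimum resolution for a target dynamic range, in bits."""
    return math.ceil((snr_db - 1.76) / 6.02)

print(ideal_snr_db(12))    # ideal SNR of a 12-bit ADC
print(bits_needed(90.0))   # minimum bits for a 90 dB target
```

A 12-bit converter tops out around 74 dB even in the ideal case, so a path demanding 90 dB of dynamic range needs at least 15 bits, before any allowance for real-converter imperfections. This kind of arithmetic, together with the power cost per bit, is what decides where the analog/digital boundary should sit.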

Nevertheless, it is rare that the entire signal path can be digitized, and the appropriate partitioning largely determines the performance, power consumption, and cost tradeoffs of the device. Judicious use of digital technology to enhance or partially replace analog signal paths is a key area for innovation and differentiation.

Design methodologies
As we approach system-level integration in mixed-signal SOC designs, we need new design methodologies to manage the resulting complexity and verify that the solution performs as intended (and that we had the correct intentions, i.e., requirements, in the first place). This calls for systematic methods and tools for requirements capture and traceability, along with top-down design and verification methodologies that encompass mixed-signal designs.

We also need tools that eliminate the discontinuity (and redundancy) in requirements representation and design representation that cause unproductive work, hide errors, and separate the system designers from the circuit designers. This is known as “single-source” design information. These system-level approaches to the design flow require a considerable cultural shift in an organization and executive-level support to implement. Nevertheless, this whole topic is a key enabler and cannot be neglected.

A company can have technology coming out of its ears and methodologies galore, but it may still not succeed commercially unless it has a “can do, will do” attitude to getting things done and always challenges the technological status quo. Innovation and value creation occur when we have the guts to challenge previous limitations, and the technical skills that allow us to succeed from time to time. I must also mention that execution discipline (which is also an attitude) is mandatory to make sure innovation becomes a shippable product in a timely manner.

12 comments on “Enabling Analog Integration”

  1. eafpres
    February 15, 2013

    Hi Charles–interesting ideas on driving forces for analog improvements.  I'm sure you watch the daily announcements in nano-technologies.  It seems to me that there is good work in nano-wires, such as carbon-nanotubes, which might be applied to some of the issues you note.  For instance, building high-value inductors in smaller spaces using nanowires vs. lithographic/metallization techniques.  Also, in some high power analog, might there be use for higher current capacity but small wires to reduce size without heat problems?  Again, nanowires could apply here.

    Do you have any thoughts along these lines and do you know of any work leveraging nano-technologies for analog integration?

  2. Charles Razzell
    February 15, 2013

    @eafpres, you make an excellent point. Fabrication of carbon nanotube inductors is clearly very challenging, but the rewards of success here could be very high. Recently, progress has been reported by IBM in the area of carbon nanotube transistor fabrication in an IC process, which has a different objective (ultra-high-density logic). It is also worth mentioning new possibilities arising from the use of graphene in electronics. The high electron mobility in graphene could have applications in radio-frequency analog circuits, for example. The transition from silicon to carbon is still in its infancy, but the possibilities are truly exciting.

    You are right to point out that we can anticipate some disruptive IC technologies from the whole field of nano-technology. These are a bit further from commercialization than the areas I described in my post.

  3. Michael Dunn
    February 15, 2013

    I would question how well analog integration  can work. 

    Sure, there can be highly integrated chips dedicated to specific functions – medical AFEs come to mind – video encoders – I'm sure there are others.

    And analog FPGAs  already exist to a limited degree. But these will never take the place of optimized designs built out of simpler components. A programmable or highly integrated part can't match the quality, cost, and specificity of a custom design. 

    Programmable analog can be great in small doses – glue  on a larger chip, like on SmartFusion parts. 

    Some integration could be useful, like combining converters with filters and amps. And as has been written about here, combining digital to create fusion chips with greater price:performance has been taking place since the first ∑∆ converters.

    I don't see much scope beyond these examples, but what do I know?

  4. Michael Dunn
    February 15, 2013

    Oops, I of course forgot to mention RF. There's some wonderful integration happening in RF subsystems. That  might be a promising area, though even RF is going digital. Still, an RF “FPGA” could be a very useful device.

  5. Charles Razzell
    February 16, 2013


    Analog integration is not only possible, but essential to make any modern cellular phone a practical reality in the form factors we have come to expect. Usually, high levels of analog integration are application-specific rather than general purpose (i.e., ASSPs or ASICs). Over time, circuits that were once too demanding for integration become feasible to integrate. For example, the signal paths in cellular base stations are starting to become amenable to integration. Gordon Moore, in his famous 1965 paper, commented:

    Integration will not change linear systems as radically
    as digital systems. Still, a considerable degree of integration
    will be achieved with linear circuits. The lack of large-value
    capacitors and inductors is the greatest fundamental limitations
    to integrated electronics in the linear area.

    It is nice to see that even with the limitations viewed from the perspective of 1965, it was clear to the most celebrated cheerleader for digital integration that “a considerable degree of integration will be achieved with linear circuits.” It is also heartening to note that some of the limitations mentioned such as capacitor density are now being successfully addressed by process innovations.

    Field-programmable analog would certainly be on my wish list too! However, from my perspective, the greatest progress being made today is in the area of application-specific integration of analog functions.

  6. amrutah
    February 16, 2013

    eafpres, Charles,

    I came across a good IEEE paper related to the ITRS 2008 update (the latest ITRS 2012 update is available here) on “Technology Roadmap for 22nm and Beyond.” It basically discusses the ON-state current and OFF-state leakage current for process technologies below 22nm involving silicon nanowire, GaAs, and Ge transistors. Much higher performance with ultimately low power consumption will be realized.

    For analog integration, other than the density of capacitors and resistors, EM and noise will also pose major benchmarks for deciding the technology node…

  7. Charles Razzell
    February 16, 2013


    The paper you mention focuses a lot of attention on building a fast, non-leaky switch at 22nm and below, which is where we’ve seen a series of technology developments including strained silicon, high-K metal gate transistors, FinFETs, and Tri-gate transistors (Intel’s technology). These technologies are helping to sustain Moore’s law now and for several generations into the future (down to 5nm?). As you point out, there seems to be a significant body of opinion in the research community that the silicon nanowire transistor could be the next big step to take us further. When you use a nanowire, the conduction channel is a cylinder and the gate fully surrounds the wire, which intuitively makes sense as the ultimate solution for low leakage.


    What I find interesting is that the techniques cited above, which are helping to drive CMOS into single-digit-nanometer scales, increasingly rely on 3D techniques to achieve their goals. The use of the third dimension is also proving essential to overcoming the capacitor density issue, as is the case for deep-trench MIM caps, for example. So perhaps a common theme in the quest for high integration in both analog and digital domains is the move away from purely planar technology?

  8. Bill_Jaffa
    February 18, 2013

    I see two other challenges in analog integration. First, for those who need top performance in one parameter, and are less concerned (or willing to compromise) about others, an optimized, single-function part is often the best solution. Examples would be extreme accuracy/precision, highest speed, highest resolution, or lowest power. That’s why you see so many op amps, ADCs, and DACs from a vendor, to meet all these leading-edge performance needs. But the business reality is that these are often low-volume, specialized applications.

    The other issue I see is that as the vendor starts adding more functionality, the part becomes more application-specific. That’s good and bad: some apps have a broad customer base, with lots of small and medium OEMs (example: medical devices), so you can come up with, say, an ultrasound AFE and find medical and other customers.

    But other times the customer base is just a few high-volume folks–and your part is good for them, but not much good for other apps (some of the highly integrated auto ICs, for example). So while you hope to get one of those high-volume design-ins, if you don’t, there aren’t many other places you can sell that part–and so it is a loser.

  9. Charles Razzell
    February 18, 2013


    I couldn't agree more. With higher levels of integration, this becomes a higher stakes game: one where the responsibility of the product definer to *really* understand the customer requirements is more critical than ever!

    While the risks are clearly greater, so also is the opportunity to create significant value and get rewarded accordingly.

    Nevertheless, a sensible approach is to maintain a balanced portfolio of high performance building blocks and high integration ICs.


  10. Comfortable
    February 22, 2013

    I've looked into FPAA before and decided it is a non-starter for two reasons:

    1.  Whereas digital is algorithmic, analog is physical. It is easy and valuable to make an IC solution where the algorithm can be changed on the fly. Hence an FPGA. But changing something physical cannot be done on the fly. The bottom line is that programmable analog will only work where the analog algorithm is alterable (i.e., gain of an amplifier, voltage of a reference, Cypress-type parts, etc.). When one starts talking about programming fundamentals like SNR, power capability, bandwidth, etc., these are dominantly physical problems and are not “programmable” in the same sense as digital programmables (i.e., 1 Hz clock vs. 1 GHz clock – 180dB dynamic range of a system parameter).

    2.  The engineering world always tends toward higher levels of abstraction, where solutions can be enabled much more quickly with less-skilled talent. That’s the fundamental idea behind a digital FPGA. From a business perspective, companies really don’t want a transistor-level FPAA where, arguably, it would take a more-talented-than-average analog IC designer to implement a solution. Rather, they want a GUI-based solution that can be drawn quickly on a screen with some pop-up windows to fill in the blanks. Think NI and how they do test with LabVIEW, or Cypress with PSoC.

    If one looks at the financial overviews published by brokerage houses that cover analog semiconductors, one will see that the ecosystem is broken down into data converters, amplifiers, and power. Power will never be solely algorithmic over 180dB or anything close. But one could envision data converters evolving to manage the analog interface without amplifiers. In summary, if there is ever a widely adopted programmable analog FPAA, it will most likely be built with numerous data converters at the I/Os and a digital FPGA fabric, designed with a GUI and pop-ups by non-IC, or even non-analog, designers.


  11. Charles Razzell
    February 22, 2013


    What you say makes a great deal of sense, especially considering that routing in digital is not usually detrimental to performance (as long as timing closure is met), whereas anything but the most direct and short signal path in analog is detrimental to noise figure, cross-talk etc.


  12. Brad Albing
    March 20, 2013

    MD – thanks for the perfect lead-in to this:
