
(Dis)Integrating Power Consumption, Counterintuitively

The present mobile era may be best characterized by its unrelenting pressure to reduce power consumption. Each generation of consumer electronics must offer more functionality while maintaining or extending runtime. Initially, efficiency improvements were the Holy Grail, at least from a power management viewpoint, but the process of engineering efficiency into a device requires more than an efficient power supply. In fact, power supply efficiency improvements must, by definition, have diminishing returns (as they approach 100 percent).

A more comprehensive approach improves system efficiency by focusing on the load as much as on the power supply. In a basic example, a linear regulator is added to a simple battery-powered device: power supply efficiency actually goes down, but battery life is prolonged and system efficiency goes up. Why? Linear regulator efficiency is essentially VOUT/VIN (with no regulator, power supply efficiency is 100 percent). Yet the linear regulator increases battery life, because system current (usually dominated by clocked CMOS logic) is lower when the operating voltage is held at the minimum acceptable value by the regulator, rather than being allowed to rise with battery voltage. Total heat emitted by the device goes down as well. This illustrates how system-level thinking provides better efficiency than focusing only on specs.
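
The regulator example can be put in rough numbers. The sketch below uses illustrative values I've assumed (not figures from the article) and models CMOS load current as scaling linearly with supply voltage, per the dynamic-power relation P ≈ C·V²·f:

```python
# Hypothetical numbers: a Li-ion cell (4.2 V fresh) powering CMOS logic
# that needs at least 1.8 V. Dynamic CMOS power scales as C * V^2 * f,
# so at a fixed clock the supply current scales roughly with voltage.

def cmos_current(v_supply, i_at_1v8=10e-3):
    """Load current of clocked CMOS logic, scaling linearly with voltage."""
    return i_at_1v8 * (v_supply / 1.8)

def battery_current_direct(v_batt):
    # No regulator: the logic runs straight off the battery voltage.
    return cmos_current(v_batt)

def battery_current_ldo(v_batt, v_reg=1.8):
    # LDO holds the logic at 1.8 V; battery current equals load current
    # (ignoring the LDO's small quiescent current).
    return cmos_current(v_reg)

v = 4.2  # fresh cell
print(battery_current_direct(v))  # ~23.3 mA straight off the battery
print(battery_current_ldo(v))     # 10 mA with the LDO, despite VOUT/VIN ~= 43%
```

Even though the LDO's own "efficiency" is only 1.8/4.2, the battery delivers less than half the current, and battery current is what runtime actually depends on.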

Integration can effectively enable system-level improvements in power consumption. Yes, it is possible to integrate multiple functions while doing nothing to reduce power: you can throw two ICs onto one piece of silicon with no system optimizations and be done with it. That may help cost and size but doesn't accomplish much else. It's pretty rare for mixed-signal devices to be designed this way; usually, more is expected for the effort.

Today, efforts to save power are more intertwined with the system and require higher levels of integration. A good example is shutting off system blocks that are not in use, sometimes for only a few milliseconds. This often brings constraints on sequencing and ramp rates. An apps processor or controller can manage this, but it's not very practical. It's better if integrated power management does it, particularly when powering up from a dead state, where the system's brains may not yet be awake.
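
The sequencing job can be sketched in code. This is a minimal, hypothetical model of what a PMIC's power-up state machine does; the rail names, voltages, and timings are invented for illustration, not taken from any datasheet:

```python
import time

# Hypothetical rail list: (name, target volts, ramp ms, settle ms).
# A real PMIC bakes an ordering like this into hardware so it works
# even before the application processor is awake.
POWER_UP_SEQUENCE = [
    ("VDD_CORE", 1.0, 2, 1),
    ("VDD_IO",   1.8, 2, 1),
    ("VDD_ANA",  3.3, 5, 2),
]

def power_up(enable_rail, sequence=POWER_UP_SEQUENCE):
    """Bring rails up in order, waiting out each ramp and settle time."""
    for name, volts, ramp_ms, settle_ms in sequence:
        enable_rail(name, volts, ramp_ms)
        time.sleep((ramp_ms + settle_ms) / 1000.0)

# Record the order in which rails were enabled.
log = []
power_up(lambda name, volts, ramp_ms: log.append(name))
print(log)  # ['VDD_CORE', 'VDD_IO', 'VDD_ANA']
```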

The use of on-chip power MOSFET switches provides another counterintuitive example of using integration to improve system efficiency. The general thinking is that external FETs provide lower on-resistance and hence better performance, but a system view provides a clearer and sometimes different answer. Off-chip MOSFETs require on-chip drivers. If these are designed for a range of external devices (a common practice to allow multiple FET vendors), then the drivers must be overdesigned. These drivers will use more power than might otherwise be needed, and they can be especially wasteful at the high switching frequencies needed to keep passive components small. When driver and power MOSFETs are integrated together, the FET is precisely known, so the driver can be optimized to a degree not possible when driving an external switch.
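
To see why overdesigned drivers hurt at high switching frequency, note that gate-drive loss is roughly Qg × Vdrive × fsw. A quick sketch with assumed (not vendor) numbers:

```python
# Gate-drive loss is roughly Qg * Vdrive * fsw. A driver sized for the
# biggest external FET it might ever see pays that worst-case gate charge
# on every switching cycle. Numbers below are illustrative assumptions.

def gate_drive_loss(qg_nc, v_drive, f_sw_hz):
    """Power spent charging and discharging the FET gate, in watts."""
    return qg_nc * 1e-9 * v_drive * f_sw_hz

f_sw = 2e6  # 2 MHz keeps passives small but multiplies drive loss
overdesigned_driver = gate_drive_loss(qg_nc=60, v_drive=5.0, f_sw_hz=f_sw)
matched_integrated = gate_drive_loss(qg_nc=12, v_drive=5.0, f_sw_hz=f_sw)
print(overdesigned_driver)  # ~0.6 W budgeted for a worst-case external FET
print(matched_integrated)   # ~0.12 W when the driver is matched to the FET
```

The five-to-one gate-charge spread here is invented, but it shows the shape of the argument: the penalty scales directly with frequency, which is exactly where small passives push the design.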

Mobile device designs frequently include current monitoring for fault detection, thermal control, battery gauging, or other goals. Without integration, when different system blocks need current information, they often must measure it separately. If current-sense resistors are used, each of these creates a separate loss. An integrated power management device minimizes power loss by making only one current measurement and passing that to all the blocks that need it. Also, when implemented with integrated power MOSFETs, sense-FET structures can eliminate sense resistors. Then power is switched and current is monitored in a single device, further reducing losses.
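
The sense-resistor argument is easy to quantify. Assuming illustrative values, each shunt dissipates I²R, so sharing one measurement across N consumers saves (N − 1) of those losses:

```python
# Each current-sense resistor dissipates I^2 * R. If three blocks (fault
# detection, thermal control, battery gauging) each use their own shunt,
# the loss triples versus one shared measurement. Values are illustrative.

def sense_loss(i_amps, r_ohms, n_resistors):
    """Total dissipation across n identical shunts carrying i_amps."""
    return n_resistors * i_amps ** 2 * r_ohms

i_load = 2.0     # amps through the sense path
r_sense = 0.010  # 10 milliohm shunt

separate = sense_loss(i_load, r_sense, n_resistors=3)
shared = sense_loss(i_load, r_sense, n_resistors=1)
print(separate, shared)  # ~0.12 W vs ~0.04 W
```

A sense-FET structure goes one step further by removing r_sense entirely from the power path.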

Every day, new designs emerge that leverage integration to reduce heat and size while extending battery life. It's safe to say that, until some form of free portable energy is invented, a primary goal and benefit of integration will continue to be power reduction. Let us know your thoughts on this design philosophy, and share your design experiences.

17 comments on “(Dis)Integrating Power Consumption, Counterintuitively”

  1. DEREK.KOONCE
    May 16, 2013

    So true as to the potential waste of power in having a generic MOSFET driver controlling a MOSFET. Back when I was with Vishay and the DrMOS product was being developed, the idea of a generic MOSFET driver had to be tossed; sizing the driver for the MOSFET to be used was critical, especially in a PC core supply. There, the top MOSFET typically has about four times the on-resistance of the synchronous MOSFET due to the short duty cycle required to step down from 12 volts to the core's 1 volt.

  2. DEREK.KOONCE
    May 16, 2013

    Also during this time, power supply efficiency was moving from 96% to 97% and even 98%. Many questions arise as to the cost-benefit factor of trying to add another 1% or 0.5% above the 98% efficiency value, especially when it comes to space and lower wattages.

  3. bjcoppa
    May 16, 2013

    Silicon carbide and gallium nitride based power MOSFETs are on the rise due to the enhanced performance they can achieve over GaAs- and Si-based devices. The benefits of SiC have been known for decades, but now the supply chain is ramping as this material becomes more competitive from a cost perspective. For very high voltages, over 500V, it can even be the primary material of choice due to its high breakdown voltage. Look for a post on this topic soon on Planet Analog.

  4. Davidled
    May 16, 2013

    Generally speaking, MOSFETs have a lot of power dissipation, which wastes a lot of energy. By controlling the duty cycle of the PWM, a charge pump might be added to increase power efficiency. Secondly, I am unclear on the questions you raise about going from 96% to 98%: a 2% deviation might occur with ambient temperature, as efficiency is always affected by temperature.

  5. Brad Albing
    May 17, 2013

    I know the SiGe devices can have quite high breakdown voltages. But how well do they do as lower voltage devices, per Derek's previous comment? I.e., since you want a very low duty cycle to buck the intermediate bus voltage down to sub-1VDC, the upper FET must be a fast device. To be fast, the devices are typically smaller (all other things being equal, which I acknowledge is a sloppy generalization). And when they are smaller, RDS(on) rises, which promptly mucks up the efficiency you were trying to achieve. So, is it better with SiGe?

  6. Scott Elder
    May 17, 2013

    “Silicon carbide and gallium nitride based power MOSFETs are on the rise due to enhanced performance criteria”

    I agree here. The value of going from 96% to 97% efficiency is overstated. Better that the huge loads run directly offline (using, for example, GaN) and bypass an interim supply rail. This is definitely the way to go for, say, LED backlight drivers for TVs or PCs.

    Granted, the fears of routing 250V around the PCB need to be overcome, but this is a better place to derive measurable efficiency value.

  7. bjcoppa
    May 17, 2013

    SiGe is implemented more for its high-frequency benefits, due to the mobility enhancement from strain effects relative to unstrained silicon. Breakdown voltage is more critical for high-power devices on the order of several hundred volts, where SiC is more advantageous. However, SiC is more expensive on a unit basis than SiGe, though that is less critical for military applications and low volumes.

  8. Brad Albing
    May 21, 2013

    Derek – I went through some of the same issues when dealing with customers, trying to help them decide whether to use an integrated-FET buck switcher or a controller IC plus external FETs.

  9. Brad Albing
    May 21, 2013

    That extra 1% or 0.5% is sometimes not especially critical – unless you're the Marketing guy trying to sell product – or the customer who's trying to pit one supplier against the other and negotiate a price advantage based on that 0.5% delta.

  10. Brad Albing
    May 21, 2013

    Well, besides the fear of running the 250V around, you may also incur additional cost with regard to UL certification issues. Which I acknowledge is part of your NRE, hence one-time, amortized over all units produced (and previously discussed here). There's also a slight additional cost for a larger PC board to get more trace separation, or to accommodate oddball routing to maintain spacing/creepage distance. But that's another blog….

  11. Dirceu
    May 22, 2013

    “Today efforts to save power are more intertwined with the system and require higher levels of integration. A good example is shutting off system blocks that are not used, sometimes only for a few milliseconds. This often brings constraints on sequencing and ramp rates.”

    If you look at the competition between different processors on the market, you will see that active-mode consumption in µW/MHz is very close across them. Instead, multiple low-power modes are offered that, in combination with an appropriate software strategy (duty-cycle management), lead to a lower average current. But then comes the issue of reaction time from sleep modes. In general, the answer given by the manufacturers is to reduce interrupt latency.

  12. Brad Albing
    May 23, 2013

    >>But then comes the issue of reaction time from sleep modes. In general, the answer given by the manufacturers is to reduce interrupt latency.

    Dirceu – this goes beyond my ken as an analog design engineer. What is interrupt latency, and how does it affect power draw?

  13. Dirceu
    May 23, 2013

    Brad,

    It's not about the instantaneous current, but about reducing the average current. Let me explain: in the past, the idea was to reduce power consumption by running the processor at the slowest clock that could still execute a given task. With the emergence of low-power modes, the concept has changed slightly. Suppose a system uses one of the low-power modes to stay “sleeping” most of the time and only “wake up” when a task must be performed (periodically or upon external request). Now you want a processor with a fast clock that performs the task in the shortest time (note the success of the ARM Cortex-M0 and M0+ cores). The figure below illustrates these ideas. Note that with fixed “Run” and “Sleep” currents, a lower average current is achieved through a smaller duty cycle. Of course, this is simplified; in practice there are several distinct power levels (for example, switching an on-board radio from RX to TX mode). I discussed this in more detail in my presentation at ESC Brazil 2011 (Introduction to Low Power RF with ARM Cortex-M0). Generally the processor is awakened by an interrupt request, such as a timer or a voltage transition on a pin. Thus, the processor attends to the task more quickly (lower Iavg) if it has a low interrupt latency: the time interval between the interrupt request and the execution of the first instruction of the ISR.

    By the way: I hope the author doesn't mind too much this slight shift from an analog to a digital context.

    [Figure: Average current]
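
Dirceu's duty-cycle point can be put in numbers. With illustrative currents (assumed here, not from any datasheet), the average current is the duty-cycle-weighted mix of run and sleep currents, so a faster core that finishes sooner and sleeps longer wins even at a higher run current:

```python
# Illustrative currents, not from a datasheet: average current is the
# duty-cycle-weighted mix of run and sleep currents.

def i_avg(i_run, i_sleep, duty):
    """Average current for a processor awake a fraction `duty` of the time."""
    return duty * i_run + (1.0 - duty) * i_sleep

# A slower core stays awake longer; a faster core sleeps sooner.
slow = i_avg(i_run=3e-3, i_sleep=1e-6, duty=0.10)  # awake 10% of the time
fast = i_avg(i_run=5e-3, i_sleep=1e-6, duty=0.02)  # awake 2% of the time
print(slow, fast)  # ~301 uA vs ~101 uA
```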

  14. Brad Albing
    May 23, 2013

    OK, I understand. That's a good explanation regarding the duty cycle and overall power draw. Thanks.

  15. WKetel
    June 20, 2013

    With all of the comments, I saw no reference to any method for halting those background applications that get stuck into many portable devices and keep running and wasting power even while not being used. Some method of reporting that an application program is still running, along with a method of stopping it, could improve battery charge life without needing any new design improvements. Sometimes it is the invisible things that must be examined closely.

  16. Brad Albing
    June 20, 2013

    So, we need an app that can run in the background and see if there are any apps running in the background that aren't needed. That should help. Um, wait – maybe I just made things worse….

  17. WKetel
    June 20, 2013

    Brad, no – actually what we need is something to run in the foreground occasionally, to do the check. My issue is with those programs that run hidden in the background.
