Five Tips for Reducing Light Load Power Consumption
Your new power supply will work efficiently under full load—but how well will it do under light or no load? Here are five tips you should consider.
It can be quite a challenge to meet today’s mandated efficiency requirements for power supplies. Just understanding the requirements is difficult enough, thanks to the dizzying array of initiatives and directives that vary by end equipment, power level, and governing authority. These include Energy Star, the California Energy Commission, and the EU Stand-by Initiative, to name a few. However, after a quick glance at any of these energy conservation initiatives, it becomes clear that one of the greatest challenges for the power supply designer is minimizing power loss at light loads and no load. Here are five ways to remove those last few milliwatts from an offline flyback supply.
1. Pick a “green” controller.
The controller chip is the brain of the power supply. Selecting a device that is specifically designed for reducing light load losses is the first critical step to meeting most standby requirements. Luckily, manufacturers of power supply controller chips are answering the call for more energy efficient devices by introducing a new generation of green-mode controllers.
Most of these green-mode flyback controllers are current-mode controlled, so their control signals include information about the amount of loading on the supply’s output. At light loads, the controllers enter a burst mode of operation. During burst mode, these controllers alternate between an ON and an OFF state. During the OFF state, the controller essentially goes to sleep and the power components of the supply are left idle (not switching). Because no power is transferred during the OFF state, the output voltage begins to droop. The green-mode controller monitors the output voltage and eventually enters the ON state to replenish the output voltage. Much of the power loss occurs during the ON state, so the ON-OFF duty cycle significantly impacts the overall efficiency. The ON state typically lasts for a few hundred microseconds. The duration of the OFF state depends on the loading, and can last for tens of milliseconds at extremely light loads.
Figure 1: Burst-mode operation results in a low-frequency ripple
One side effect of burst-mode operation is an additional low-frequency ripple voltage on the output. During the ON state, the output contains the typical ripple voltage associated with normal switching of the power supply. However, additional ripple content is superimposed at the burst frequency. This is shown in Figure 1. Because the burst frequency is quite low, it is not practical to attenuate it with an L-C filter. Instead, the low-frequency output voltage deviation is best reduced by increasing the output capacitance.
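As a quick sanity check on output-capacitor sizing, the droop during the OFF state can be approximated by assuming the standby load discharges the output capacitor linearly. A minimal sketch (the standby load, OFF time, and ripple budget below are illustrative values, not from any specific design):

```python
def burst_droop(i_load, t_off, c_out):
    """Approximate output droop (V) while the controller sleeps:
    the load discharges the output capacitor linearly."""
    return i_load * t_off / c_out

def c_out_for_droop(i_load, t_off, dv_max):
    """Minimum output capacitance (F) that keeps burst-mode droop below dv_max."""
    return i_load * t_off / dv_max

# Example: 10 mA standby load, 20 ms OFF state, 100 mV allowed droop
c_min = c_out_for_droop(10e-3, 20e-3, 0.1)
print(f"C_out >= {c_min * 1e6:.0f} uF")
```

Because the burst frequency is so low, even a modest standby load and droop budget can demand far more capacitance than normal-ripple filtering would require.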
In addition to burst-mode operation, most green-mode controllers implement other energy-saving features such as reduced quiescent current draw by the controller. Many use quasi-resonant switching to improve efficiency at all load levels. Quasi-resonant flyback supplies use the resonance formed by the transformer leakage inductance and parasitic capacitances to turn on the MOSFET with reduced loss.
2. Minimize loss in start-up resistors.
Most flyback controllers generate their own bias power from an auxiliary winding of the transformer. However, they need some way to get started initially. This has been traditionally accomplished by connecting a resistance from the rectified AC voltage to the VCC pin of the controller. The resistance must be low enough so that the controller has enough current to turn on at the lowest AC input voltage. Making the resistance too small leads to excessive power dissipation and could prevent achieving the desired compliance.
The startup current required by the controller is usually listed near the top of the electrical characteristics table in the datasheet. The latest green-mode controllers have pushed this current down below 50 μA. For a supply that must run over the universal AC input range of 85 V to 265 V, using a 2 MΩ pull-up resistor would guarantee at least 50 μA of startup current at low-line. At the nominal US line voltage of 120 V, where compliance testing is usually required, the resistor dissipates only 13 mW of power. While 13 mW might not break the power budget, at the nominal European line voltage of 230 V, the power loss in the resistor is four times as much. Depending on the application and system loading during standby, 52 mW could be significant.
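The arithmetic above can be sketched as follows, assuming the pull-up resistor sees roughly the rectified peak of the AC line and that VCC sits near 10 V (both simplifying assumptions for illustration):

```python
import math

def startup_resistor(v_ac_rms, r_pullup, v_cc=10.0):
    """Return (available startup current in A, resistor dissipation in W),
    assuming the pull-up sees the rectified peak of the AC line."""
    v_r = v_ac_rms * math.sqrt(2) - v_cc   # voltage across the resistor
    i = v_r / r_pullup
    return i, v_r * i

# 2 Mohm pull-up; the controller needs 50 uA to start
i_lo, p_lo = startup_resistor(85, 2e6)    # low-line: current must exceed 50 uA
i_us, p_us = startup_resistor(120, 2e6)   # ~13 mW at the US nominal line
i_eu, p_eu = startup_resistor(230, 2e6)   # roughly 4x that at the EU line
print(f"{i_lo * 1e6:.0f} uA, {p_us * 1e3:.0f} mW, {p_eu * 1e3:.0f} mW")
```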
Figure 2: Controllers with cascode connections to the MOSFET greatly reduce startup resistor loss.
Some controllers can provide the startup current through a transistor that is switched off after the controller has completed a successful startup sequence. This transistor can be an additional external component, or sometimes is included inside the controller IC. In either case, this additional high-voltage transistor adds cost to what is typically a cost-sensitive product. Also, incorporating the transistor in the same package as the controller can lead to problems with creepage, clearance, and reliability.
A similar approach to handling the startup current is used by controllers that implement a cascode connection to the power MOSFET. This is shown in Figure 2. With the cascode connection a DC voltage is applied to the gate of the MOSFET, while the controller turns the FET on by pulling the source low. The controller is able to use the source connection of the MOSFET to obtain its initial startup current. It does so by operating the MOSFET in a linear mode during startup. No additional high-voltage components are needed, and there are no high-voltage connections to the controller. This approach still requires a pull-up resistor to provide the transistor’s gate voltage, but the gate connection typically requires less than 10 μA.
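The benefit is easy to quantify. A rough comparison, assuming a DC gate voltage of about 12 V and a pull-up fed from the rectified line peak (illustrative values, not from a specific design):

```python
import math

def max_gate_pullup(v_ac_rms_min, v_gate, i_gate):
    """Largest pull-up (ohms) that still supplies i_gate at the minimum line."""
    return (v_ac_rms_min * math.sqrt(2) - v_gate) / i_gate

def pullup_loss(v_ac_rms, v_gate, r):
    """Dissipation (W) in the gate pull-up at a given line voltage."""
    return (v_ac_rms * math.sqrt(2) - v_gate) ** 2 / r

# Only ~10 uA is needed for the gate, so the resistor can be ~10 Mohm
r = max_gate_pullup(85, 12, 10e-6)
print(f"R = {r / 1e6:.1f} Mohm, {pullup_loss(230, 12, r) * 1e3:.1f} mW at 230 V")
```

Compared with the roughly 50 mW burned by a 2 MΩ startup resistor at the European line, the gate pull-up dissipates only a few milliwatts.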
3. Let it ring.
Snubbing and clamping circuits used on the primary-side MOSFET are another prime area to realize power savings. The very common RCD clamp shown in Figure 3 is used to reduce ringing and prevent overvoltage stress by limiting the voltage spike on the drain of the MOSFET. This voltage spike is caused by energy stored in the transformer’s leakage inductance when the MOSFET turns off and abruptly halts current flow in the primary winding.
Figure 3: Reduce losses by optimizing the clamping circuit.
The first step to reducing both the voltage spike and the loss in the clamp is to design a transformer with minimal leakage inductance. Beyond that, the clamp resistance can be increased to further reduce the loss, but doing so also increases the magnitude of the voltage spike. During the reset portion of the switching cycle, the reflected output voltage is impressed across the clamp resistor, leading to extra loss. Using a higher voltage MOSFET, e.g., 800 V instead of 600 V, provides more margin for the voltage spike and allows for a much larger resistor. However, an increased voltage rating results in either a more expensive MOSFET, or one with higher on-resistance, which impairs the efficiency at heavier loads. Often a compromise must be made among cost, light-load efficiency, and nominal-load efficiency. In some supplies designed for 10 W or less, the clamp could be removed entirely, providing significant energy savings. Of course, EMI concerns may limit how much ringing can be allowed on the drain.
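A common first-order estimate of RCD clamp dissipation multiplies the leakage energy per switching cycle by the switching frequency, scaled up by the ratio of the clamp voltage to its margin above the reflected output voltage. The sketch below uses illustrative component values (not from a specific design) to show how allowing a higher clamp voltage, enabled by a higher-rated MOSFET and larger clamp resistor, reduces the loss:

```python
def rcd_clamp_loss(l_lk, i_pk, f_sw, v_clamp, v_reflected):
    """First-order RCD clamp dissipation (W): leakage energy per cycle,
    scaled up because the reflected output voltage also feeds the clamp."""
    e_lk = 0.5 * l_lk * i_pk ** 2
    return e_lk * f_sw * v_clamp / (v_clamp - v_reflected)

# Illustrative: 10 uH leakage, 1 A peak current, 65 kHz switching,
# 100 V reflected voltage; compare a 200 V clamp against a 300 V clamp
p_tight = rcd_clamp_loss(10e-6, 1.0, 65e3, 200, 100)
p_loose = rcd_clamp_loss(10e-6, 1.0, 65e3, 300, 100)
print(f"{p_tight:.3f} W vs {p_loose:.3f} W")
```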
What might be less obvious is that decreasing the clamp capacitance also can reduce the light-load loss. When a controller is operating in burst mode, the clamp circuit discharges between ON states. If the clamp capacitor is too large, excess energy is stored and dissipated during the OFF state. In some situations, the clamp capacitor may not fully discharge before the next ON state begins. Setting the time constant of the clamp RC network to around 10 times the switching period is a good general rule for reducing this loss.
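The 10-switching-period rule of thumb translates directly into a capacitor value. An illustrative calculation (the clamp resistance and switching frequency are assumed for the example):

```python
def clamp_cap_for_tau(r_clamp, f_sw, tau_periods=10):
    """Clamp capacitance (F) that sets the RC time constant to
    roughly tau_periods switching periods."""
    return tau_periods / (f_sw * r_clamp)

# 100 kohm clamp resistor, 65 kHz switching: aim for RC ~= 10 periods
c = clamp_cap_for_tau(100e3, 65e3)
print(f"C_clamp ~= {c * 1e9:.1f} nF")
```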
Another technique is to replace the RCD clamp with a Zener clamp. A Zener clamp can reduce the loss in the clamp at light loads. However, at heavier loads the Zener clamp can be significantly more dissipative than an RCD clamp.
4. Squeeze milliwatts out of the secondary regulating circuitry.
When it comes to standby loss, every circuit must be investigated, including the error amplifier that regulates the output. The left side of Figure 4 shows a typical regulating circuit for a 12 V supply. The often-used TL431 requires at least 1 mA of quiescent current to guarantee regulation. This current is provided through R2, which typically leads to anywhere from 15 mW to 50 mW of loss. The resistor divider formed by R3 and R4 sets the output voltage. With a series resistance of 12.6 kΩ, these resistors dissipate 11 mW.
Figure 4: Anywhere from 20 mW to 55 mW of loss can be eliminated from the regulation circuit.
The right side of Figure 4 shows a more efficient way of regulating the output. The TL431 is replaced by the TLV431, which requires only 80 μA of quiescent current to guarantee regulation. The current driven through the optocoupler is sufficient to power the TLV431, so R2 is eliminated. The TLV431 is only rated for 6.3 V maximum, so a “poor man’s linear regulator” circuit comprising Q1, R5, and D1 protects this device. R5 and D1 add an extra 3 mW of loss. By increasing the resistance of the feedback divider by a factor of 10, an additional 10 mW is saved.
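Tallying the figures quoted above (the 15 mW to 50 mW range for R2, the 12.6 kΩ divider, and the roughly 3 mW added by R5 and D1) gives a rough estimate of the total savings:

```python
V_OUT = 12.0

# Original circuit (left side of Figure 4)
p_r2_min, p_r2_max = 15e-3, 50e-3        # loss range for the TL431 bias resistor
p_div_old = V_OUT ** 2 / 12.6e3          # ~11.4 mW in the 12.6 kohm divider

# Improved circuit (right side): R2 eliminated, divider 10x larger,
# ~3 mW added by the protective regulator (R5 and D1)
p_div_new = V_OUT ** 2 / 126e3
p_extra = 3e-3

save_min = p_r2_min + (p_div_old - p_div_new) - p_extra
save_max = p_r2_max + (p_div_old - p_div_new) - p_extra
print(f"~{save_min * 1e3:.0f} mW to {save_max * 1e3:.0f} mW saved")
```

The result lands near the 20 mW to 55 mW range cited for the improved regulation circuit.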
5. Be smart about the bias level.
If you still need to eke out a little more power savings, optimizing the controller’s bias voltage might allow you to achieve your goal. The bias voltage must be high enough to ensure that the controller remains active under all loading conditions. The voltage also must be sufficiently high to enhance the MOSFET when it is applied to the gate. Setting the bias voltage any higher than what is required by the controller and MOSFET just leads to excess loss.
Most green-mode controllers reduce their quiescent current when they operate in burst mode. This lessens their contribution to the losses associated with the bias voltage. Typical quiescent currents drop from 2 mA to 3 mA during normal operation to 200 μA to 300 μA during burst operation. This current, specified in the controller datasheets, does not include charging and discharging the gate of the MOSFET. The gate-charge power equals the product of the bias voltage, gate charge, switching frequency, and the duty cycle of burst mode. Because the gate charge increases with an increasing bias voltage, an unnecessarily high voltage further compounds the loss. Thankfully, burst-mode operation prevents the bias losses from becoming overly significant. Minimizing the bias voltage saves around 10 mW to 20 mW in most cases.
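The gate-charge and quiescent contributions can be estimated directly from that product. A sketch with illustrative values (the gate charge, burst duty cycle, and bias voltage below are assumptions, not figures from a specific datasheet):

```python
def gate_charge_power(v_bias, q_gate, f_sw, burst_duty):
    """Average gate-drive power (W): each switching cycle moves q_gate
    through v_bias, but switching occurs only during the burst ON state."""
    return v_bias * q_gate * f_sw * burst_duty

def quiescent_power(v_bias, i_q):
    """Controller quiescent loss (W) drawn from the bias rail."""
    return v_bias * i_q

# Illustrative: 12 V bias, 10 nC gate charge, 65 kHz, 2% burst duty, 250 uA Iq
p_gate = gate_charge_power(12, 10e-9, 65e3, 0.02)
p_q = quiescent_power(12, 250e-6)
print(f"gate: {p_gate * 1e3:.2f} mW, quiescent: {p_q * 1e3:.1f} mW")
```

Note how the low burst duty cycle keeps the gate-drive term small; the quiescent draw from the bias rail dominates, which is why trimming the bias voltage itself pays off.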
Minimizing light-load losses in power supplies requires scrutinizing the power loss in every component. Just a few milliwatts could determine whether a product is Energy Star compliant. Implementing these techniques could shave hundreds of milliwatts from a product’s standby power consumption.
About the Author
Brian King is an Applications Engineer at Texas Instruments and a Member of the Group Technical Staff. Brian is a member of IEEE and holds a BSEE and MSEE from the University of Arkansas. You can reach Brian at firstname.lastname@example.org.
Texas Instruments Inc.