I have a lithium battery pack circuit that steps up 4.2 V (two 4.2 V cells in parallel) to 5 V. Another option is to use a step-down converter from 8.4 V (two 4.2 V cells in series) to 5 V. Assuming both circuits are well implemented, which choice would be more efficient in terms of power dissipation?
I am looking for a general rule, such as "step-down is always preferable to step-up" or "only the absolute voltage difference matters", etc.
Answer
Boost converters are typically less efficient than buck converters, but not by much. The fundamental reason is that in a boost converter the inductor current flows through the switch to ground during the on-time rather than to the load, so energy reaches the output only during the off-time, whereas in a buck converter the inductor current flows to the output over the whole switching period. An EE Times article, Unscrambling the power losses in switching boost converters, makes this point:
> Generally, however, since inductor current flows to ground during the on time, only a fraction (off time to period ratio) flows to the output, as illustrated by the pulsing currents in Figure 2 (this is the reason why boost converters are generally less power efficient than buck converters).
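To put numbers on that fraction for the two options in the question, here is a rough sketch assuming ideal converters in continuous conduction mode; the 5 V output is from the question, everything else is idealized for illustration:

```python
# Ideal CCM duty cycles for the two options in the question.
# Assumed: lossless converters; real duty cycles will differ slightly.
V_OUT = 5.0

# Option A: boost from two parallel cells (pack voltage 4.2 V)
v_in_boost = 4.2
d_boost = 1 - v_in_boost / V_OUT   # D = 1 - Vin/Vout ~ 0.16
frac_boost = 1 - d_boost           # output sees inductor current only
                                   # during the off-time, ~84% of the period

# Option B: buck from two series cells (pack voltage 8.4 V)
v_in_buck = 8.4
d_buck = V_OUT / v_in_buck         # D = Vout/Vin ~ 0.60
frac_buck = 1.0                    # output sees inductor current all period

print(f"Boost: D = {d_boost:.2f}, output carries inductor current {frac_boost:.0%} of the period")
print(f"Buck:  D = {d_buck:.2f}, output carries inductor current {frac_buck:.0%} of the period")
```

With only 4.2 V stepped up to 5 V, the boost delivers inductor current to the output for about 84% of each period, so the penalty the article describes is fairly mild in this particular case.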
Since the efficiency differences are not significant, you are probably better served deciding on additional criteria instead of regulator efficiency alone, including:
- Charger complexity (single-cell charging, as with the parallel pack, is simpler).
- Cost
- Size
- Cells in series will be limited by their weakest cell.
- Cells in parallel can charge one another when their voltages drift apart, and that chemical round trip costs some efficiency.
- Losses due to the higher input current at the lower pack voltage (parallel cells); see the sketch below.
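As a rough sketch of that last point (the 5 W load and the 50 mΩ lumped series resistance are assumptions purely for illustration; real figures depend on the cells, PCB, and switch):

```python
# Compare conduction (I^2 * R) loss in the input path for the two packs.
# Assumed: ideal conversion, 5 W output, 50 mOhm lumped series resistance.
P_OUT = 5.0        # watts, assumed load
R_SERIES = 0.05    # ohms, illustrative lumped cell + trace + switch resistance

for label, v_pack in (("parallel, 4.2 V", 4.2), ("series, 8.4 V", 8.4)):
    i_in = P_OUT / v_pack            # average input current, ideal conversion
    p_loss = i_in ** 2 * R_SERIES    # conduction loss in the input path
    print(f"{label}: Iin = {i_in:.2f} A, I^2*R loss = {p_loss * 1000:.0f} mW")
```

Halving the input current by putting the cells in series cuts this particular loss term by roughly a factor of four for the same series resistance.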