I've been told that many kinds of batteries work best if they are used until they are completely drained, and then recharged.
(Edit: The "memory effect myth" is quite widespread. Batteries work just as well if they are "topped up" every time.)
Right now I design devices the standard way: use the battery power as long as possible, and then, when there is no power left, the device stops working entirely.
If there were some simple way of implementing the following, I might use it from now on:
- Two independent batteries (or more)
- Once you start using one battery, continue to use it until it reaches the manufacturer-recommended minimum voltage.
- When the "in use" battery is completely dead, switch to using the next battery. Presumably you use some technique similar to the "Switch between 5V power supplies?" question.
- Once you switch to the last battery, go into some sort of low-power "limp mode" that can still do the important things, and notify the user to plug it into a charger as soon as possible.
- After plugging into the charger, go back to standard mode -- but if disconnected from the charger before the batteries are fully charged, go back to limp mode.
- (optional) When plugged into the charger, only charge the drained battery (or batteries) and keep them topped off; leave the one "in use" battery alone.
- (optional) Keep track of precisely how much total energy could be extracted from each particular battery the most recent time it was fully drained. Use that number (perhaps modified using Peukert's law) to give extremely accurate estimates of future run-time. (A rough firmware sketch of the whole scheme follows this list.)
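To make the idea concrete, here is a minimal firmware sketch of the scheme above, in C. It is only a sketch: every helper routine (read_cell_mv(), select_cell(), charger_present(), cells_fully_charged(), notify_user_low_power()) is a hypothetical placeholder for board-support code, and the cell count and cutoff voltage are just example numbers.

    #include <stdint.h>
    #include <stdbool.h>
    #include <math.h>

    #define CELL_COUNT  2
    #define CELL_MIN_MV 3000   /* example manufacturer-recommended cutoff */

    enum power_state { NORMAL, LIMP, CHARGING };

    extern uint16_t read_cell_mv(int cell);     /* hypothetical ADC read */
    extern void     select_cell(int cell);      /* hypothetical load switch */
    extern bool     charger_present(void);
    extern bool     cells_fully_charged(void);
    extern void     notify_user_low_power(void);

    static int active = 0;                  /* index of the "in use" cell */
    static enum power_state state = NORMAL; /* the rest of the firmware
                                               checks this to decide what
                                               functionality to shed */

    /* Peukert estimate of remaining run-time in hours:
       t = H * (C / (I*H))^k, where H is the rated discharge time (h),
       C the capacity measured on the last full discharge (Ah),
       I the present draw (A), and k the cell's Peukert exponent. */
    double runtime_hours(double C, double I, double H, double k)
    {
        return H * pow(C / (I * H), k);
    }

    void power_task(void)   /* call this periodically */
    {
        if (charger_present()) {
            state = CHARGING;   /* charger tops up the drained cells */
            return;
        }
        if (state == CHARGING) {
            /* Charger just removed: back to standard mode only if the
               batteries finished charging, otherwise stay in limp mode. */
            if (cells_fully_charged()) {
                state  = NORMAL;
                active = 0;
                select_cell(active);
            } else {
                state = LIMP;
            }
        }
        if (read_cell_mv(active) <= CELL_MIN_MV && active + 1 < CELL_COUNT) {
            select_cell(++active);            /* switch to the next battery */
            if (active == CELL_COUNT - 1) {   /* last one: enter limp mode */
                state = LIMP;
                notify_user_low_power();
            }
        }
    }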
Why don't all devices do this?
- cell phones: first battery: talk like crazy. last battery: only emergency calls.
- laptops: last battery: throttle back to a slow speed that is adequate for looking up static documents
- handheld GPS: last battery: try to reduce energy use by updating the screen less often, dimming the backlight, etc.
Answer
Why don't all devices use this? It adds cost and complexity. Is there any other reason for not doing something?
Seriously, I'd say that there are plenty of options and implementations for this. Having two equal batteries doesn't make much sense, so often the second is used for emergency or limp-home power. For instance, your PC has a RAM-retaining battery on the motherboard for when you lose power. A laptop often gives a "Low battery" warning, at which time you're welcome to reduce power however you can.
I think that your statement that "batteries work best if they are used until they are completely drained, and then recharged" is a little broad. This is more the case for nickel-based chemistries (NiCd and, to a lesser extent, NiMH). Lithium-ion cells don't suffer from this memory problem; in fact, their lifetime improves if you avoid deep discharges. See this page from BatteryUniversity.com for reference.
There are a couple of options for doing more intelligent power management in your own devices.
The simplest is an ORing diode on the power supply. If all you want is a hot-swappable power supply and you have a bit of leeway on your inputs, you can connect the backup battery to the anode of a diode and connect the cathode to your main battery. When the main battery's voltage dips to 0.7 V below the backup's (or the main battery is removed), the backup kicks in. Be careful of leakage current into the backup battery; it might overcharge it.
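To put a number on the crossover point, here is a tiny sketch of that comparison in C, assuming a typical silicon diode drop of about 0.7 V (the millivolt readings would come from an ADC; the function name is my own):

    #include <stdint.h>
    #include <stdbool.h>

    #define DIODE_DROP_MV 700  /* typical silicon diode forward drop */

    /* True once the main battery has sagged far enough below the backup
       that the ORing diode is forward-biased and the backup is carrying
       the load. */
    bool backup_is_supplying(uint16_t main_mv, uint16_t backup_mv)
    {
        return backup_mv > main_mv + DIODE_DROP_MV;
    }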
Alternatively, you can use a power mux IC like the TPS2110. This lets you select your input independently (or dependently, if you prefer) of the input voltages, instead of always using the higher supply.
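If the mux's select input is wired to a GPIO, the firmware side can be as simple as the sketch below. The gpio_write() routine and pin number are hypothetical board-support details; the actual control pins are in the datasheet.

    #include <stdbool.h>

    #define MUX_SELECT_PIN 4  /* hypothetical GPIO driving the mux select */

    extern void gpio_write(int pin, bool level);  /* hypothetical helper */

    /* Choose the supply explicitly, regardless of which rail is higher. */
    void use_backup_supply(bool backup)
    {
        gpio_write(MUX_SELECT_PIN, backup);
    }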
Finally, Linear Technology incorporates what they call "PowerPath" controllers into their battery charging ICs. I've used their LTC4011, which seamlessly transitions between battery and external power and charges the battery while running from the external supply.