Recently I posted an answer mentioning the classic advice that "lithium batteries like partial discharges, so design your system for a limited depth of discharge". But then I wondered: with partial discharges, the number of charge/discharge cycles needed to deliver the same energy increases as well, which eats into the gain in cycle lifetime. For example, a cellphone battery discharged by 50% in the morning, recharged, discharged by 50% again in the afternoon and recharged overnight needs twice as many cycles as one discharged by 100% and recharged once a day to deliver the same energy over its life. I thought it would be interesting to look into that.
I went ahead and, as usual, I submit my findings for SE users' approval and welcome anyone to add to them.
I should point out this only covers regularly used batteries, not those sitting on a shelf for longer than a few days. Batteries do also age over time independently of cycling (calendar aging), but I do not have data on that; perhaps experts could shed some light on it.
Answer
My quick look into it:
The cycle lifetime of lithium batteries decreases with the depth of discharge, following a curve like the one below (the curve shown is for lead-acid batteries, but lithium is stated to follow a similar shape):

[Figure: cycle life vs. depth of discharge] (source)
If the lifetime at 100% DoD is taken as the reference, one can plot what I call the "isoenergy" curve (I gave the name a two-second thought), which is the number of cycles the battery must perform to deliver the same total energy as 100% discharges would over its entire lifetime: $$\mathrm{isoenergy}(DoD) = \frac{\text{lifetime cycles at } 100\%\ DoD}{DoD}$$ For example, 50% DoD requires twice as many cycles as 100% DoD, 25% four times as many, etc.
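To make this concrete, here is a minimal sketch (mine, not from any datasheet) that computes the isoenergy requirement and compares it against an assumed DoD-to-cycle-life table. The `cycle_life` numbers are illustrative placeholders shaped like the curve above, not measured data:

```python
# Minimal sketch: compare the "isoenergy" cycle requirement against an
# assumed cycle-life curve. All cycle_life values are hypothetical
# placeholders shaped like the lead-acid curve referenced above.

cycle_life = {
    1.00: 500,     # assumed cycles to end-of-life at 100% DoD (reference)
    0.50: 1500,    # assumed cycles at 50% DoD
    0.25: 4000,    # assumed cycles at 25% DoD
    0.10: 12000,   # assumed cycles at 10% DoD
}

reference_cycles = cycle_life[1.00]  # lifetime cycles at 100% DoD

def isoenergy_cycles(dod: float) -> float:
    """Cycles needed at a given DoD to deliver the same total energy
    as reference_cycles full (100% DoD) discharges."""
    return reference_cycles / dod

for dod in sorted(cycle_life):
    required = isoenergy_cycles(dod)
    # margin > 1 means the battery outlives the isoenergy demand
    margin = cycle_life[dod] / required
    print(f"DoD {dod:4.0%}: lifetime {cycle_life[dod]:6d} cycles, "
          f"isoenergy needs {required:7.0f} cycles, margin x{margin:.1f}")
```

With these placeholder numbers the lifetime margin grows as the DoD shrinks: the extra cycles demanded by shallow discharges are more than paid back by the longer cycle life, which is the pattern behind the conclusion below.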
The results with this particular example are plotted below:

[Figure: isoenergy cycle requirement vs. actual cycle lifetime for this example]
Conclusion: even accounting for the extra cycles, it still holds that the depth of discharge should be minimised as much as possible.