I have wondered for a long time why circuits that use LEDs for lighting almost always put a resistor in series with the LED, and the answer in this question finally seems to explain why. (It's the easiest way to limit the current through the LED and keep it from burning up.)
But still, isn't this a big problem? Don't those resistors waste a lot of power, and is there really no other practical solution?
Is there a reasonable calculation that could provide some numbers showing how much power is lost as heat in the resistor in a typical lighting application? According to the answers, the power loss is apparently so small that it doesn't matter. How can one show this with real numbers?
Answer
You wanted a calculation. Here is the basic form of the calculation.
A typical red LED has a forward voltage drop of 1.8 V and a maximum continuous current of around 20 mA.
Now what's our supply voltage? Let's say we want to use a 3 V source. That leaves a drop of 3.0 V - 1.8 V = 1.2 V across the resistor. The current through the resistor is 20 mA, which calls for a resistance of 1.2 V / 20 mA = 60 Ω, and the power dissipated in it is 1.2 V * 20 mA = 24 mW. That is not really a lot of power, although it is a significant fraction of the LED's own consumption: the LED itself uses 1.8 V * 20 mA = 36 mW.
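For anyone who wants to plug in their own numbers, here is a small Python sketch of the same arithmetic. The function name and the 3 V / 1.8 V / 20 mA figures are just the example values used above; substitute your own supply voltage, forward drop, and current.

    # Sketch of the series-resistor sizing and power calculation above.
    def led_resistor_power(v_supply, v_forward, current):
        """Return (resistance in ohms, resistor power in W, LED power in W)."""
        v_resistor = v_supply - v_forward       # voltage dropped across the resistor
        resistance = v_resistor / current       # Ohm's law: R = V / I
        p_resistor = v_resistor * current       # power wasted as heat in the resistor
        p_led = v_forward * current             # power consumed by the LED itself
        return resistance, p_resistor, p_led

    # Example values from the answer: 3 V supply, 1.8 V red LED, 20 mA.
    r, p_r, p_led = led_resistor_power(v_supply=3.0, v_forward=1.8, current=0.020)
    print(f"Resistor: {r:.0f} ohm, dissipating {p_r * 1000:.0f} mW")  # 60 ohm, 24 mW
    print(f"LED power: {p_led * 1000:.0f} mW")                        # 36 mW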