Monday, 13 June 2016

Doesn't using resistors in series with LEDs all the time waste a lot of energy?


I have wondered for a long time why, in schematics that use LEDs for lighting, it is so common to put a resistor in series with the LED, and it finally seems like the answer to this question explains why. (It's the easiest way to control the current through the LED and keep it from burning up.)


But still, isn't this a big problem? Don't those resistors waste a lot of power, and isn't there any other practical solution?


Is there a reasonable calculation that could provide some numbers to show just how much power is lost as heat in the resistor in a typical lighting application? According to the answers, the power loss is apparently so small that it doesn't matter. How can one show this with real numbers?




Answer



You wanted a calculation, so here is the basic form of it.


A typical red LED has a forward voltage drop of 1.8 V, and a maximum continuous current of around 20 mA.


Now, what's our supply voltage? Let's say we want to use a 3 V source.


So we will have a voltage drop of 3.0 V - 1.8 V = 1.2 V across our resistor. The current through the resistor will be 20 mA, so the power dissipated in it is 1.2 V * 20 mA = 24 mW. That is not really a lot of power, although it is a significant fraction of the power consumed by the LED itself, which is 1.8 V * 20 mA = 36 mW.
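As a quick sanity check, here is a minimal sketch of the same arithmetic in Python. The supply voltage, forward drop, and current are the figures assumed above; the 60-ohm resistor value follows from Ohm's law and is not stated in the original answer.

    # Series-resistor power loss, using the figures from the answer above:
    # a red LED with a 1.8 V forward drop driven at 20 mA from a 3.0 V supply.
    V_SUPPLY = 3.0      # supply voltage, volts
    V_FORWARD = 1.8     # LED forward voltage drop, volts
    I_LED = 0.020       # desired LED current, amps (20 mA)

    v_resistor = V_SUPPLY - V_FORWARD   # voltage across the series resistor
    r_series = v_resistor / I_LED       # Ohm's law: required resistance
    p_resistor = v_resistor * I_LED     # power dissipated in the resistor
    p_led = V_FORWARD * I_LED           # power dissipated in the LED

    print(f"Series resistor: {r_series:.0f} ohms")          # 60 ohms
    print(f"Resistor loss:   {p_resistor * 1000:.0f} mW")   # 24 mW
    print(f"LED power:       {p_led * 1000:.0f} mW")        # 36 mW
    print(f"Fraction wasted: {p_resistor / (p_resistor + p_led):.0%}")  # 40%

With these particular numbers the resistor burns about 40% of the total 60 mW drawn from the source, which is noticeable in relative terms but tiny in absolute terms for a single indicator LED.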


