If I had an LED with a forward voltage of 2.2 volts at a forward current of 20 mA, and I somehow created a battery that was exactly 2.2 volts and put that LED, and only that LED, across that battery, what would the current be?
I'm just confused about how diodes/LEDs work. I understand that usually you would have, say, a 5 V battery, and the resistor you put in series with the LED would control the current. I'm just trying to get a better grasp of LED characteristics.
Thanks
Answer
In theory this would work and you could get 20 mA. However, the system you describe is very fragile: if anything shifts slightly, you won't get your desired current. For example, you would need to control/know the following:
- The temperature that the diode operates at, possibly accounting for self-heating
- The exact voltage at which the diode draws 20 mA at that temperature (note that the datasheet will probably give only a "nominal" value or a tolerance; you would need to know the exact voltage)
- Your power supply (battery) would need to be far more precise than is practical for simply driving an LED
The problem is that diodes change their current dramatically with a very small change in voltage. This can be seen in the Shockley diode equation:
$$\Large I=I_s ( e^{\frac{V}{n V_T}}-1) $$
Here $I_s$ is the diode's saturation current, $n$ its ideality factor, and $V_T$ the thermal voltage (roughly 26 mV at room temperature). The current $I$ varies exponentially with the applied voltage $V$, so while it's possible in principle to apply a fixed voltage to a diode and get a precise current, in practice it's hard. It is much easier to control a diode by setting its current: a resistor with enough voltage headroom makes a rough current source, which is exactly what's happening when you put a resistor in series with your LED on a 5 V supply. An alternative is a constant-current sink, which is easy to build on an IC; these show up as LED driver chips that sink a programmed current, and they work well too.
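To see how sharp that exponential sensitivity is, here is a minimal Python sketch. The saturation current and ideality factor are made-up illustrative values, not taken from any datasheet: the script numerically finds the voltage that gives 20 mA, perturbs that voltage by ±50 mV, and then contrasts the result with the series-resistor approach from a 5 V supply.

```python
import math

# Illustrative (assumed) diode parameters -- not from any real datasheet.
I_S = 1e-18        # saturation current, A (assumed)
N = 2.0            # ideality factor (assumed)
V_T = 0.02585      # thermal voltage kT/q at ~300 K, V

def diode_current(v):
    """Shockley diode equation: I = Is * (exp(V / (n*Vt)) - 1)."""
    return I_S * (math.exp(v / (N * V_T)) - 1)

# Bisection: find the voltage at which this model draws exactly 20 mA.
lo, hi = 0.0, 3.0
for _ in range(100):
    mid = (lo + hi) / 2
    if diode_current(mid) < 0.020:
        lo = mid
    else:
        hi = mid
v_20ma = (lo + hi) / 2

# Shift the applied voltage by +/- 50 mV and watch the current swing.
for dv in (-0.05, 0.0, +0.05):
    i = diode_current(v_20ma + dv)
    print(f"V = {v_20ma + dv:.3f} V -> I = {i * 1000:7.2f} mA")

# Contrast: series resistor from a 5 V supply, I = (Vsupply - Vf) / R.
# The same 50 mV spread in forward voltage barely moves the current.
V_SUPPLY = 5.0
R = (V_SUPPLY - 2.2) / 0.020   # 140 ohms for a nominal 20 mA
for vf in (2.15, 2.20, 2.25):
    print(f"Vf = {vf:.2f} V -> I = {(V_SUPPLY - vf) / R * 1000:5.2f} mA")
```

With these assumed parameters, a 50 mV error changes the directly driven diode's current by a factor of two or more in either direction, while the same 50 mV spread in forward voltage moves the resistor-limited current by only a couple of percent.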