There's something about the LED formula that I don't understand.
It says that if I have an LED with a forward voltage of 3V and a forward current of 20mA, and I want to drive my circuit from a 6V battery, the formula to determine the resistor's value for the LED is:
$$\frac{(6-3)V}{0.02A} = 150\Omega$$
In a circuit with the elements connected in series, the current is the same everywhere.
So, using the formula the other way around:
$$\frac{6V}{150\Omega} = 0.04A$$
So is the LED actually driven by 0.04A?
The formula is of course correct, but I don't understand where my logic is mistaken. Can someone help me?
Answer
If you applied less than 3 volts across the LED, virtually no current would flow, so it has to be assumed that there is about 3 volts across the LED and that this causes a current of about 20mA to flow. If there is 3 volts across the LED and the power supply is 6 volts, there MUST be 3 volts across the resistor. Given that 20mA is expected to flow, it's a simple case of Ohm's law to calculate R:
$$R = \frac{3V}{0.02A} = 150\Omega$$
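As a quick sanity check, here is a small sketch (my own addition, not from the original answer) that carries out the same arithmetic and shows why dividing the full 6V by 150Ω overcounts the current:

```python
# Sketch of the series-resistor sizing above, assuming the LED drops
# a roughly constant forward voltage once it is conducting.

def led_resistor(v_supply, v_forward, i_forward):
    """Resistor value so i_forward flows once the LED drops v_forward."""
    return (v_supply - v_forward) / i_forward

R = led_resistor(6.0, 3.0, 0.020)
print(R)  # 150.0 ohms

# The question's mistake: the full 6 V is NOT across the resistor.
# Only 6 V - 3 V = 3 V appears across it, so the series current is
print((6.0 - 3.0) / R)  # 0.02 A, not 6 V / 150 ohm = 0.04 A
```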
Take a look at how a 2V LED might conduct:
This is just a picture I took from the web. Below about 1.7 volts it hardly conducts any current, and at 2 volts it's taking 20mA. As you can see, at around 2 volts the current could plausibly be anywhere between 5mA and 45mA. For this particular LED, from a 6 volt supply, there would be 4 volts across the resistor with about 20mA flowing, and this leads to a resistor value of 200 ohms.
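Putting those numbers into the formula from the question (a 2V LED at 20mA from a 6V supply):
$$\frac{(6-2)V}{0.02A} = 200\Omega$$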
Below is another way of looking at how the series resistor alters the overall impedance of the circuit. Again, this is for a 2 volt LED in series with a 100 ohm resistor:
With 100 ohms in series the net resistance of the two components dictates that at about 4 volts applied, the current is about 17mA.
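For anyone who wants to reproduce that operating point numerically, here is a rough sketch (my own assumption, not from the answer) that intersects the resistor's load line with a crude exponential model of a 2V LED. The curve parameters are invented for illustration, so treat the numbers as approximate:

```python
import math

def led_current(v):
    # Crude exponential model of a ~2 V LED: about 20 mA at 2.0 V,
    # almost nothing below ~1.7 V. The parameters are assumed, not measured.
    return 20e-3 * math.exp((v - 2.0) / 0.05)

def operating_point(v_supply, r, steps=60):
    # Bisect on the LED voltage: v + i(v)*r - v_supply rises monotonically,
    # so it crosses zero exactly once between 0 and v_supply.
    lo, hi = 0.0, v_supply
    for _ in range(steps):
        mid = (lo + hi) / 2
        if mid + led_current(mid) * r > v_supply:
            hi = mid
        else:
            lo = mid
    v = (lo + hi) / 2
    return v, led_current(v)

v_led, i = operating_point(4.0, 100.0)
print(f"LED voltage ~ {v_led:.2f} V, current ~ {i * 1000:.1f} mA")
# With these assumed curve parameters the result is about 2 V and 20 mA,
# in the same ballpark as the ~17 mA read off the real plotted curve.
```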