I've researched this and found that a resistor limits the current flowing through the LED.
But this statement confuses me: we know that in a series circuit the current is the same at every point, so how can a resistor limit the current through the LED?
Answer
LEDs have a fairly constant voltage across them, around 2.2V for a red LED, which rises only slightly with current. If you supply 3V to this LED without a series resistor, the LED will try to settle at a voltage/current combination for that 3V. But there is no sensible current that goes with such a voltage; theoretically it would be tens, maybe hundreds of amperes, which would destroy the LED. And that's exactly what happens if your power supply can deliver enough current.
So the solution is a series resistor. If your LED needs 20mA, you can calculate the resistor for the red LED in the example:
\$ R = \dfrac{\Delta V}{I} = \dfrac{3V - 2.2V}{20mA} = 40 \Omega\$
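A quick way to check the arithmetic is to put the same formula into a few lines of code. This is only a sketch: the function name is made up for illustration, and the values (3V supply, 2.2V red LED, 20mA) are taken from the example above.

```python
# Minimal sketch of the series-resistor calculation from the example above.
# series_resistor() is just an illustrative name, not a standard function.

def series_resistor(v_supply, v_led, i_led):
    """Return the resistance that drops the excess supply voltage
    at the desired LED current: R = (V_supply - V_led) / I_led."""
    return (v_supply - v_led) / i_led

r = series_resistor(v_supply=3.0, v_led=2.2, i_led=0.020)
print(f"R = {r:.0f} ohm")  # prints "R = 40 ohm", matching the worked example
```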
You may think that supplying 2.2V directly would also work, but that's not true. The slightest difference in LED or supply voltage may cause the LED to light very dimly, very brightly, or even be destroyed. A series resistor ensures that small differences in voltage have only a minor effect on the LED's current, provided that the voltage drop across the resistor is large enough.
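To see that last point numerically, here is a small sketch that assumes the LED stays pinned near its 2.2V forward voltage and sweeps the supply voltage slightly around 3V; the specific test voltages are only illustrative.

```python
# Rough illustration: with the 40 ohm series resistor and the LED held near
# 2.2 V, a 0.1 V change in supply voltage shifts the current by only a few mA.

R = 40.0      # series resistor from the calculation above, in ohms
V_LED = 2.2   # assumed (roughly constant) red-LED forward voltage

for v_supply in (2.9, 3.0, 3.1):
    i = (v_supply - V_LED) / R          # Ohm's law across the resistor
    print(f"{v_supply:.1f} V supply -> {i * 1000:.1f} mA")

# 2.9 V -> 17.5 mA, 3.0 V -> 20.0 mA, 3.1 V -> 22.5 mA: only a modest swing,
# instead of the huge swing an LED without a resistor would see.
```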