I have a simple circuit:
+12V -- R1 -- LED1 -- LED2 -- LED3 -- ground
If the Forward Voltage of an LED is 3V, and the Forward Current is 20mA, I can (I believe) calculate the required resistance of the resistor as (12V - (3 * 3V)) / 0.02A = 150Ω.
From what I understand, that should give me a Voltage Drop of 3V over the resistor and each LED respectively, and a current of 20mA through the circuit - perfect.
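As a quick sanity check of the arithmetic above, here is a minimal sketch in Python (the names are mine, chosen for illustration):

```python
# Series LED string: the resistor drops whatever voltage the LEDs don't.
V_SUPPLY = 12.0   # supply voltage (V)
V_F = 3.0         # assumed forward voltage per LED (V)
N_LEDS = 3        # number of LEDs in series
I_F = 0.020       # target forward current (A)

r = (V_SUPPLY - N_LEDS * V_F) / I_F
print(r)  # 150.0 ohms
```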
In the simulation of this circuit, I get a Voltage Drop of 4.01V, 2.66V, 2.66V, 2.66V respectively, and a current of 26.74mA through the circuit, which is too high for the LEDs.
This makes me think that I don't understand the relationship between forward voltage and voltage drop. So how am I supposed to calculate a resistor value that won't burn out the LEDs?
Apologies if this is asked a lot or is really simple, but I've been searching for ages and haven't come up with anything.
Answer
In your simulation, you specified the LEDs' forward voltage as "3V at 1A". This means the forward voltage of the LEDs at roughly 20mA will be much lower.
Everything else is right; you just need to read the LEDs' datasheet to find the forward voltage at around 20mA, and use that value in your resistor calculation.
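To see why the simulated current comes out high, you can model the diode with the Shockley equation anchored at the "3V at 1A" spec point and solve for the circuit's actual operating point. This is only a sketch: the emission coefficient `N` below is a guess (your simulator's default model parameters will differ), so the exact numbers won't match your simulation, but the trend will: the forward voltage drops below 3V at low current, so the current rises above 20mA.

```python
import math

VT = 0.02585               # thermal voltage at ~300 K (V)
N = 2.0                    # emission coefficient -- assumed; simulator defaults differ
I_SPEC, V_SPEC = 1.0, 3.0  # the "3 V at 1 A" point given to the simulator

def vf(i):
    """Diode forward voltage at current i, from the Shockley equation
    anchored at the (V_SPEC, I_SPEC) spec point."""
    return V_SPEC + N * VT * math.log(i / I_SPEC)

# Solve 12 = 150*I + 3*Vf(I) for the operating point by fixed-point iteration.
i = 0.020  # initial guess: the intended 20 mA
for _ in range(100):
    i = (12.0 - 3.0 * vf(i)) / 150.0

print(f"I = {i * 1000:.1f} mA, Vf = {vf(i):.2f} V per LED")
```

With these assumed parameters the solver lands at roughly 24mA and about 2.8V per LED, above the intended 20mA, which is the same effect the simulation shows.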