I want to light up an LED from 220 VAC using the fewest components possible. These two circuits come to mind:
R1 and R2 will be around 200K to 300K, depending on the brightness I need. Not much brightness is required, so I could even go higher as long as I still get some light out of the LED.
Which one (if any) will work?
(I am not much concerned with efficiency, as these will be used as indicators showing when a high-power device is turned on, for example a geyser. The circuit will be wired in parallel with the geyser to accomplish that. If I use a 200K resistor, I'll be dissipating around 0.25 watts, which is negligible compared to the 1000 watts consumed by the main device.)
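As a quick check of these figures (a rough sketch in Python, assuming 230 V RMS mains and a ~2 V LED forward drop; rectification details are ignored, and half-wave conduction would roughly halve the dissipation):

```python
# Rough sanity check of the numbers above (assumed values: 230 V RMS mains,
# ~2 V LED forward drop, a single 200K series resistor; rectification
# details are ignored).
V_rms = 230.0      # mains voltage, RMS
V_led = 2.0        # assumed LED forward drop
R = 200e3          # series resistor, ohms

I_rms = (V_rms - V_led) / R      # current through the LED, RMS
P_res = I_rms ** 2 * R           # power dissipated in the resistor

print(f"LED current ~ {I_rms * 1e3:.2f} mA RMS")   # ~1.1 mA
print(f"Resistor dissipation ~ {P_res:.2f} W")     # ~0.26 W
```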
Answer
Both are bad efficiency-wise, but the second one is playing with fire. If D12's reverse leakage current is comparable to or higher than D11's, the reverse half-cycle voltage divides between the two diodes according to those leakage currents, so D11 will drop half of the mains voltage or more and will probably be damaged.
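To see why, here is a crude illustration (purely a sketch, with made-up leakage figures rather than datasheet values): model each reverse-biased diode as an effective resistance of reference voltage over leakage current and see how the mains peak divides between them.

```python
# Crude model of reverse-voltage sharing between two series diodes.
# Leakage currents below are hypothetical, not datasheet values.
import math

V_peak = 230.0 * math.sqrt(2)   # ~325 V peak mains

I_leak_led = 0.1e-6   # D11, the LED: assumed to leak very little
I_leak_d12 = 5e-6     # D12, the rectifier diode: assumed to leak more

V_ref = 50.0                      # reference voltage for the leakage model
R_led = V_ref / I_leak_led        # effective reverse resistance of the LED
R_d12 = V_ref / I_leak_d12        # effective reverse resistance of D12

V_led_reverse = V_peak * R_led / (R_led + R_d12)
print(f"Reverse voltage across the LED ~ {V_led_reverse:.0f} V")
# -> well over 300 V in this example, while typical LEDs are only
#    rated for around 5 V reverse
```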
If you really insist on using the second circuit (as it has somewhat better efficiency and lets you use a smaller resistor), put both diodes in it:
(Schematic omitted; created using CircuitLab.)
EDIT:
Now that you've got me thinking, another circuit comes to mind, if using a small capacitor is OK for you:
This one adds some capacitive load to your mains, but it doesn't dissipate any significant power itself, so technically it is more energy-efficient. Note that FakeMoustache has a better version of this circuit in his answer; mine is more of a concept. The resistor he has is not needed in continuous operation, but it protects the circuit from inrush current at startup.
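For a feel of the numbers involved (a rough sketch, assuming 230 V / 50 Hz mains and a 100 nF X2-rated capacitor; these are my assumed values, not ones from the schematic):

```python
# Rough numbers for the capacitive-dropper idea (assumed values: 230 V,
# 50 Hz mains and a 100 nF X2 series capacitor).
import math

V_rms = 230.0
f = 50.0
C = 100e-9                               # series capacitor

X_c = 1.0 / (2 * math.pi * f * C)        # capacitive reactance
I_rms = V_rms / X_c                      # LED current, ignoring the small diode drops
S_reactive = V_rms * I_rms               # volt-amperes drawn from the mains

print(f"Reactance ~ {X_c / 1e3:.1f} kohm")       # ~31.8 kohm
print(f"LED current ~ {I_rms * 1e3:.1f} mA")     # ~7.2 mA
print(f"Capacitive load ~ {S_reactive:.2f} VA")  # ~1.7 VA, almost all reactive
```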