For an enclosure, I will control the power at the terminals of a panel-mount AC socket using an SSR. The socket will have an indicator lamp in parallel to show whether it is live. According to its datasheet, the lamp draws only 3 mA at 230 V AC.
Below is the diagram of the system:
As shown above, the system worked, but the lamp was extremely dim when the relay was ON. I then connected a voltmeter across it (across A and B) and nothing changed, except that I read 230 V AC even when the relay was OFF, due to capacitive coupling.
But when I set the voltmeter to its low-impedance setting, the indicator started to glow as it is supposed to, and the ghost voltage during the OFF periods disappeared as well.
So I went back to the SSR's datasheet; under its output characteristics it says "Min. load current to maintain ON: 50 mA".
My conclusion was that if the load draws less than 50 mA I will have a problem, whether with the indicator lamp or with any similarly weak load.
To solve this, can I connect a resistor across nodes A and B in my diagram that draws 50 mA, so that weak loads like the lamp work properly when the SSR is ON?
If I size a resistor across A and B to draw 50 mA, the largest resistance that works is 230 V / 50 mA = 4.6 kΩ, so roughly a 4.7 kΩ standard value, and the power dissipated in it is about 230 V × 50 mA ≈ 11.5 W.
I have never used a resistor for such a purpose in an AC power application. What kind of resistor and power rating would be appropriate? Do I need a heatsink? The relay might stay ON for more than six months at a time. Can this be a solution, or what else can be done?
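As a quick check of that arithmetic (a rough Python sketch; the only inputs are the 230 V mains and the 50 mA figure from the datasheet):

```python
# Quick check of the bleeder resistor sizing (assumes a 230 V RMS supply and the
# 50 mA "minimum load current to maintain ON" from the SSR datasheet).
V_RMS = 230.0      # mains voltage, volts RMS
I_MIN = 0.050      # SSR minimum holding current, amps

R_max = V_RMS / I_MIN    # largest resistor that still draws about 50 mA (~4.7 kohm)
P_diss = V_RMS * I_MIN   # continuous dissipation in that resistor

print(f"R_max  = {R_max:.0f} ohm")
print(f"P_diss = {P_diss:.1f} W continuous")
```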
Answer
The problem with the light being on is not the minimum current; it's the "leakage current" caused by the snubber network (a resistor in series with a capacitor) inside the SSR. That is 10 mA according to the datasheet, implying an impedance of about 23 kΩ, assuming they are using 230 V AC as the test voltage. Your solution idea is sound; however, the numbers may be off.
There is no "guaranteed off" voltage specified for the lamp, but if we assume that 10% of line voltage makes it dim enough, you would need a resistor that draws around 100 mA when the output is on, wasting about 23 watts. That seems pretty wasteful to me. Maybe if it only rarely switches on, and briefly, it might be acceptable, but usually not (and your case of it being on almost continuously is exactly such a case). You could buy a 50 W chassis-mount resistor (e.g. a Dale), but the wastefulness is not very elegant.
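For reference, here is that estimate worked through (a rough Python sketch; the 10% "dim enough" threshold is only an assumption, and the exact result lands a little under the round 100 mA / 23 W figures above):

```python
# Off-state voltage-divider estimate (assumptions: 230 V RMS line, 10 mA snubber
# leakage per the datasheet, and "dim enough" = no more than 10% of line voltage
# across the lamp while the SSR is off).
V_RMS  = 230.0
I_LEAK = 0.010            # SSR off-state leakage through the snubber, A
V_FRACTION_OFF = 0.10     # allowed fraction of line voltage across the lamp when off

Z_snub = V_RMS / I_LEAK   # effective source impedance of the leakage path (~23 kohm)

# When off, the bleeder and snubber form a divider:
#   V_lamp / V_line = R_bleed / (R_bleed + Z_snub)
# Largest bleeder that still keeps the lamp below the threshold:
R_bleed = Z_snub * V_FRACTION_OFF / (1 - V_FRACTION_OFF)

I_on = V_RMS / R_bleed    # current the bleeder draws when the SSR is on
P_on = V_RMS * I_on       # power wasted in the bleeder when the SSR is on

print(f"Z_snub ~ {Z_snub/1e3:.0f} kohm")
print(f"R_bleed <= {R_bleed:.0f} ohm, drawing ~{I_on*1e3:.0f} mA and wasting ~{P_on:.0f} W when on")
```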
On the other hand, if the lamp's internal circuit is as I suspect, the actual 'leakage' into a diode load may be much less than 10 mA, so a higher-value resistor may suffice; you would have to test that. You might want to try a 47 kΩ resistor to see if it works. A 5 W or 10 W wirewound "cement" type is suitable and does not require an external heatsink.
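A quick check of why no heatsink is needed at that trial value (assuming the full 230 V RMS sits across it while the SSR is on):

```python
# Dissipation in the 47 kohm trial bleeder (assumes 230 V RMS across it when on).
V_RMS = 230.0
R = 47e3
P = V_RMS**2 / R
print(f"P ~ {P:.2f} W")   # ~1.1 W, well within a 5-10 W wirewound rating
```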
Another suggestion would be to use an indicator driven by the input voltage to the SSR, if possible (the one you have is not compatible with the low DC input voltage).
Alternatively, you could use a capacitor rated for across-the-line service in series with a wirewound resistor, instead of the resistor alone. That would draw current without wasting much power. The capacitor would be physically large, like a motor-run capacitor, maybe a few µF in series with 100 Ω.
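As a rough sketch of the numbers for that R-C dummy load (assuming 50 Hz mains; the 2 µF and 100 Ω values below are only illustrative):

```python
# Rough numbers for a series R-C dummy load across A-B (assumptions: 230 V RMS,
# 50 Hz mains, C = 2 uF rated for across-the-line use, R = 100 ohm wirewound).
import math

V_RMS, F = 230.0, 50.0
C, R = 2e-6, 100.0

X_C = 1.0 / (2 * math.pi * F * C)   # capacitive reactance, ohms
Z   = math.hypot(R, X_C)            # magnitude of the series impedance
I   = V_RMS / Z                     # RMS current drawn from the line
P_R = I**2 * R                      # real power, dissipated only in the resistor

print(f"X_C ~ {X_C:.0f} ohm, |Z| ~ {Z:.0f} ohm")
print(f"I ~ {I*1e3:.0f} mA drawn, but only ~{P_R:.1f} W of real power in the resistor")
```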