I have had a twilight switch lying in my basement for five years, after it had spent about five years in my garden. This week I took it apart to convert it from 230 VAC to 12 VDC for (solar) use.
What is left (for sake of this discussion) is a resistive bridge as follows:
(CircuitLab schematic of the resistive bridge; the component values are referenced below.)
What surprised me is that the threshold voltage V(OUT) was about 200 mV at a given twilight level, where I would have expected something reasonably close to \$V_{REF}=4\text{V}\times \frac{33\text{k}\Omega}{33\text{k}\Omega+47\text{k}\Omega}=1.65\text{V}\$. The LDR measures about 660 kΩ at the intended light level.
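To put a number on the mismatch (taking the LDR as the upper leg of the divider to the 4 V rail and R1 as the lower leg to ground), working backwards from the measured 200 mV gives

\$\$\frac{V_{OUT}}{4\text{V}}=\frac{R_1}{R_1+R_{LDR}}=\frac{0.2\text{V}}{4\text{V}}=0.05\;\Rightarrow\;R_{LDR}\approx 19\,R_1,\$\$

whereas tripping at \$V_{REF}=1.65\text{V}\$ would require \$R_{LDR}=R_1\left(\frac{4}{1.65}-1\right)\approx 1.42\,R_1\$.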
I am thinking of changing R1 to 470 kΩ. That should give me a more reasonable V(OUT), somewhere around half the supply voltage, which seems to have been the original intent given R2/R3 (see the sketch after the list below). I pulled the datasheet of the op-amp and I think I can just get away with the input bias and input offset currents, but what is the wise thing to do? The options I see:
- Indeed increase R1, or will the LDR age again as much as it did before (about a factor of 3 in the intended region)?
- Replace the LDR with a new one, for which I will have to reconsider R1 anyway;
- Replace it with a photodiode or a phototransistor, but I would have to research what the circuit would look like.
- Am I overlooking something?
- What is the cause of the huge difference between the expected and actual voltage, or put differently: do LDRs age, and if so, how?
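For reference, here is a quick back-of-the-envelope check of the R1 = 470 kΩ proposal, written as a small Python sketch. It assumes the divider topology described above, and the bias-current values are placeholders rather than figures from my op-amp's datasheet:

```python
# Quick sanity check for the proposed R1 = 470 k change.
# Assumptions (mine, not measured): the LDR is the upper leg of the divider
# to the 4 V rail, R1 is the lower leg to ground, and the op-amp input bias
# current flows through the OUT node. The bias-current values below are
# placeholders; substitute the figure from the op-amp datasheet.

def divider_out(v_supply: float, r_ldr: float, r1: float, i_bias: float = 0.0) -> float:
    """Divider output voltage, with an optional bias-current error term."""
    v_ideal = v_supply * r1 / (r1 + r_ldr)
    # The bias current develops an error across the Thevenin resistance of the
    # divider; its sign depends on the op-amp, so treat it as a magnitude.
    r_thev = r1 * r_ldr / (r1 + r_ldr)
    return v_ideal + i_bias * r_thev

V_SUPPLY = 4.0                              # V
R_LDR = 660e3                               # ohm, LDR at the intended twilight level
R1_NEW = 470e3                              # ohm, proposed replacement for R1
V_REF = V_SUPPLY * 33e3 / (33e3 + 47e3)     # 1.65 V from the R2/R3 divider

print(f"V(OUT) with R1 = 470k : {divider_out(V_SUPPLY, R_LDR, R1_NEW):.2f} V")
print(f"V_REF                 : {V_REF:.2f} V")

for i_b in (1e-9, 50e-9, 500e-9):           # placeholder bias currents
    shift = i_b * (R1_NEW * R_LDR / (R1_NEW + R_LDR))
    print(f"|I_bias| = {i_b * 1e9:5.0f} nA -> shift at OUT of about {shift * 1e3:5.1f} mV")
```

With the LDR at 660 kΩ this puts OUT at about 1.66 V, essentially at V_REF, and the divider's Thevenin resistance is about 275 kΩ, so a 50 nA bias current would shift the node by only about 14 mV.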
For a more complete circuit diagram, refer to Reverse engineering: Relay driver design consideration; only the high-voltage capacitive power supply part is left out there.