I'm designing a switched-mode power supply. I need to give my users the ability to set a voltage setpoint over a range of roughly 200-800 V, with a resolution of 4 V or better. This needs to be a hardware solution that feeds into my microcontroller. The goals are cheap, simple, and hard for the user to screw up. My predecessors have used potentiometers, which make it impossible to know what you've actually set without running the system. I don't have a display to work with, just a couple of blinky lights, which isn't a very effective way to communicate a voltage like this.
I'm thinking of using rotary switches. Three decimal rotary switches, with an appropriate combination of resistors, should let me translate the switch setting into an analog voltage. In my ideal world, 000 translates to 0 V, 999 translates to 3.3 V, and every setting in between scales linearly. Alternatively, I could read the same information through twelve digital inputs, but that would require an I/O expander.
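To make the intent concrete, here's roughly what I expect the firmware side of the mapping to look like; the 200-800 V constants are just placeholders for whatever my control loop actually uses:

```c
#include <stdint.h>

/* Hypothetical mapping from the three switch digits to a setpoint.
 * V_MIN/V_MAX are placeholders, not values from any real design. */
#define V_MIN 200.0f
#define V_MAX 800.0f

static float code_to_setpoint(uint8_t hundreds, uint8_t tens, uint8_t ones)
{
    uint16_t code = (uint16_t)(hundreds * 100u + tens * 10u + ones); /* 0..999 */
    return V_MIN + (V_MAX - V_MIN) * (float)code / 999.0f;
}
```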
I'm sure I'm not the first person to consider something like this. Is there a canonical way of doing this? If I go the analog route, how many resistors am I going to need? How many different values? Or is there an obviously better way to address this problem?
Answer
If you wire all three switches to give a single analog output, you need to resolve that output to about 0.1% of full scale. Since common precision resistors are 1%, and ADCs typically have a couple of counts of uncertainty, I don't think this technique is practical.
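A quick sanity check of the numbers (assuming a 3.3 V full scale and a 12-bit ADC, which the question implies but doesn't pin down):

```c
#include <stdio.h>

/* Back-of-the-envelope error budget for the single-analog-output idea,
 * assuming a 3.3 V full scale and a 12-bit ADC (common values, assumed
 * rather than stated in the question). */
int main(void)
{
    const double full_scale_mv = 3300.0;
    const double step_mv = full_scale_mv / 1000.0;   /* ~3.3 mV per code, 000..999 */
    const double lsb_mv  = full_scale_mv / 4096.0;   /* ~0.8 mV per 12-bit count */
    const double rtol_mv = 0.01 * full_scale_mv;     /* order-of-magnitude error from 1% resistors */

    printf("code step %.1f mV, ADC LSB %.1f mV, 1%% resistor error ~%.0f mV\n",
           step_mv, lsb_mv, rtol_mv);
    /* The resistor tolerance alone spans roughly ten codes, so adjacent
     * settings become indistinguishable. */
    return 0;
}
```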
However, if you wire each switch independently to an ADC input, with resistors selected to give evenly spaced voltages, it would be much easier to get reliable readings, and would even work with 5% resistors.
You could use three ADC inputs, or a single ADC with analog switches to read the input switches one at a time.
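Either way, the firmware-side decode stays trivial. A minimal sketch, assuming a 3.3 V full scale, evenly spaced levels, and ADC readings already converted to millivolts (the helper names are placeholders, not any particular HAL):

```c
#include <stdint.h>

/* Decode one switch wired as a 10-position divider producing evenly spaced
 * voltages from 0 to full scale. */
#define FULL_SCALE_MV 3300u

static uint8_t decode_digit(uint16_t adc_mv)
{
    uint16_t step  = FULL_SCALE_MV / 9u;                      /* ~367 mV between positions */
    uint16_t digit = (uint16_t)((adc_mv + step / 2u) / step); /* round to nearest level */
    return (uint8_t)(digit > 9u ? 9u : digit);
}

/* Combine the three decoded digits into the 0..999 setting. */
static uint16_t decode_setting(uint16_t mv_hundreds, uint16_t mv_tens, uint16_t mv_ones)
{
    return (uint16_t)(decode_digit(mv_hundreds) * 100u
                    + decode_digit(mv_tens)     * 10u
                    + decode_digit(mv_ones));
}
```

With levels ~367 mV apart, even 5% resistor error leaves well over 100 mV of margin on either side of each position, so the rounding in decode_digit is robust.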