I was curious how this is done. It seems you would have to vary the voltage and current to measure resistance from 1 Ω to 10 MΩ.
Answer
The multimeter does exactly the same as what you would do manually with a non-autoranging meter. Suppose you have a 3 1/2 digit meter, so 1999 is your maximum reading.
The multimeter starts at the highest(*) range, and if the reading is below 200 counts (displayed as 199.9) it switches one decade lower, repeating this until the reading is between 200 and 1999. That goes very fast because it doesn't have to display anything during this procedure, so it appears to get the right range the first time.
Or, if it includes enough logic, it can take the first measurement on the highest range and then directly select the lower range that is most appropriate for that voltage level.
For example:
1st reading, at 1.999 MΩ range: < 199.9
2nd reading, at 199.9 kΩ range: < 199.9
3rd reading, at 19.99 kΩ range: > 199.9
So this is the range we want.
Take the actual measurement: 472
That value is between 200 and 1999 counts, so that's the best resolution possible; one decade lower and the reading would overflow. So on the 19.99 kΩ range the resistance is 4.72 kΩ.
Note that during those first readings the meter doesn't really measure the actual resistance; it only checks whether the reading is above or below the 199.9 threshold.
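The stepping procedure above can be sketched in a few lines. This is a simulation, not real meter firmware: the range values follow the example, but the function names and the idea of modeling the ADC as a 0..1999 counter are my own assumptions.

```python
# Simulated 3 1/2 digit autoranging, stepping down from the highest range
# until the reading uses 200..1999 counts (hypothetical sketch).

RANGES_OHMS = [1.999e6, 199.9e3, 19.99e3, 1999.0, 199.9]  # full scale, high to low

def read_counts(resistance_ohms, full_scale_ohms):
    """Simulated ADC reading: 0..1999 counts, or None on overflow."""
    counts = round(resistance_ohms / full_scale_ohms * 1999)
    return None if counts > 1999 else counts

def autorange(resistance_ohms):
    """Return (full_scale, counts) for the best range, or None if out of range."""
    best = None
    for full_scale in RANGES_OHMS:
        counts = read_counts(resistance_ohms, full_scale)
        if counts is None:        # overflow on this range: keep the previous one
            break
        best = (full_scale, counts)
        if counts >= 200:         # reading in 200..1999: best resolution, stop
            break
    return best

# The 4.72 kOhm example from the text lands on the 19.99 kOhm range, reading 472.
print(autorange(4720.0))
```

Run against the worked example, this reproduces the three probe readings (5, 47, then 472 counts) before settling on the 19.99 kΩ range.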
Alternatively, the multimeter may have a bank of comparators that all work simultaneously, each checking against one range threshold. You get the result faster, but this requires more hardware and will probably be found only in more expensive meters.
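The comparator-bank variant amounts to a priority encoder: every comparator fires at once and the lowest non-overflowing range wins. A rough sketch, with hypothetical threshold values taken from the ranges in the example:

```python
# Sketch of the parallel-comparator approach: all thresholds are checked
# "simultaneously" and a priority encoder picks the range in one step.

RANGE_TOPS_OHMS = [199.9, 1999.0, 19.99e3, 199.9e3, 1.999e6]  # low to high

def select_range(resistance_ohms):
    """Return the full scale of the lowest range that doesn't overflow."""
    fired = [resistance_ohms <= top for top in RANGE_TOPS_OHMS]  # comparator outputs
    for full_scale, fits in zip(RANGE_TOPS_OHMS, fired):
        if fits:                 # priority encoder: first comparator that fits
            return full_scale
    return None                  # out of range on every setting

print(select_range(4720.0))      # picks the 19.99 kOhm range
```

In hardware the loop is just combinational logic, which is why this approach is faster than stepping through the ranges one at a time.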
(*) Not the lowest, as "Mary" aka TS suggested. Those as old as I am have worked with analog multimeters. If you started measuring at the most sensitive range, the needle would hit the right-hand stop hard. You could hear it say "Ouch". Switch to the next position, again "bang!". If you care for your multimeter as a good housefather ("bonus pater familias"), you start at the least sensitive range.