In the 1930s, Schrödinger stated that it is impossible to make a measurement without disturbing a system. To illustrate this, he devised the famous thought experiment of the cat in the box. Poor cat, by the way.
The same is true in electronics today. A multimeter is used in parallel with a load for a voltage measurement. This "add-on" changes the equivalent resistance and therefore the "true" voltage value. The same goes for current and other kinds of measurement.
How is it possible to compensate for this lack of precision? I don't need to be that accurate; I'm asking out of pure curiosity.
Answer
If the circuit has an impedance of 10 ohm and the multimeter has an input resistance of 10 Mohm, then the change is about 0.0001%.
By the impedance of the circuit I mean the load, the power source, and everything else combined. Together they determine the impedance the meter sees.
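The loading effect above can be sketched by modeling the circuit as a Thevenin source and putting the meter's input resistance across its terminals. The 5 V source and the resistor values are assumptions for illustration, not values from the question:

```python
# Sketch (hypothetical values): how a multimeter's input resistance
# loads the circuit it measures, modeled as a Thevenin equivalent.

def measured_voltage(v_open, r_source, r_meter):
    """Voltage the meter actually reads: the open-circuit voltage
    divided between the source impedance and the meter's input resistance."""
    return v_open * r_meter / (r_source + r_meter)

# 10 ohm circuit impedance, 10 Mohm meter: the error is ~0.0001 %
v = measured_voltage(5.0, 10, 10e6)
error_pct = (5.0 - v) / 5.0 * 100
print(f"reading = {v:.6f} V, error = {error_pct:.6f} %")
```

With a high-impedance circuit instead (say 1 Mohm in place of 10 ohm), the same formula gives an error of roughly 9%, which is the situation the answer warns about next.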
Sometimes the difference can be calculated, but not every circuit has a linear transfer. Sometimes it makes a big difference, for example in high-voltage, low-current applications and in current measurements. So yes, you have to be aware of it.
Take, for example, a Geiger counter with a Geiger tube that needs 400 V at a very low current. Suppose the voltage measured with a multimeter is 400 V; after the multimeter is removed, the voltage may rise to 450 V without you knowing it. One solution is a voltage divider with two resistors permanently connected to the high voltage. Once calibrated, the low voltage at the divider tap can be used to calculate the actual voltage.
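The divider idea can be sketched as follows. The resistor values and the meter's 10 Mohm input resistance are assumptions for illustration; the key point is that the divider (and the meter loading it) is linear, so a single calibration factor recovers the actual high voltage from any later reading:

```python
# Sketch (hypothetical values): a permanently connected voltage divider
# for a Geiger tube supply. R1, R2 and the meter resistance are assumptions.

R1, R2 = 9e6, 1e6        # divider resistors, always connected to the HV
R_METER = 10e6           # multimeter input resistance, loads the tap

def parallel(a, b):
    return a * b / (a + b)

def tap_voltage(v_hv):
    """Divider output as the meter reads it (the meter sits across R2)."""
    r_low = parallel(R2, R_METER)
    return v_hv * r_low / (R1 + r_low)

# Calibrate once against a known high voltage...
K = 450.0 / tap_voltage(450.0)

# ...then any later tap reading converts back to the actual HV.
print(f"actual HV = {K * tap_voltage(405.0):.1f} V")
```

Because the tube draws very little current, the divider itself must also be high-impedance so it does not load the supply; that is why calibration, rather than trusting the nominal ratio, is the practical approach.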
Doing a current measurement the right way is a difficult subject on its own. For example, the wires themselves have an influence. Do you know about 4-wire shunts? Measuring high-frequency current is very hard. And so on.
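The 4-wire (Kelvin) point can be shown with numbers. The shunt and lead resistances below are assumptions chosen to make the effect obvious: with a low-value shunt, the voltage drop across ordinary 2-wire leads can dwarf the drop across the shunt itself, while separate sense leads carry almost no current and so add almost no error:

```python
# Sketch (hypothetical values): why 4-wire sensing matters for a
# low-value current shunt. Shunt and lead resistances are assumptions.

R_SHUNT = 0.01      # 10 mohm current shunt
R_LEAD = 0.005      # 5 mohm per measurement lead (wiring + contacts)
I_TRUE = 2.0        # actual current, in amps

# 2-wire: the voltmeter sees the shunt PLUS both lead resistances,
# so the inferred current is far too high.
v_2wire = I_TRUE * (R_SHUNT + 2 * R_LEAD)
i_2wire = v_2wire / R_SHUNT

# 4-wire: dedicated sense leads tap the voltage right at the shunt;
# (almost) no current flows in them, so the drop is just I * R_shunt.
v_4wire = I_TRUE * R_SHUNT
i_4wire = v_4wire / R_SHUNT

print(f"2-wire reading: {i_2wire:.2f} A, 4-wire reading: {i_4wire:.2f} A")
```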