I was trying to use a DM-4100A multimeter to verify a power supply, and I'm not sure I'm using the multimeter correctly. The multimeter is >10 years old too, so maybe it is not working correctly.
When I set the meter to measure AC voltage, on either the 500V or 200V range, it reports 170-175V. I was expecting something closer to 120V (I am in the USA). Is 170V an expected voltage, or would that suggest I'm doing something wrong?
UPDATE: It sounds like it may be reporting peak voltage instead of RMS. The multimeter does not seem to have an RMS setting. Can I just take the reading and divide by 1.414 to get the RMS voltage? I want to do this to verify whether the power adapter is still producing the correct voltage.
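For reference, the peak-to-RMS relationship for a pure sine wave that the update alludes to, as a quick sketch (the 170 V figure is just the reading mentioned above; nothing here is specific to the DM-4100A):

```python
import math

# For a pure sine wave, V_rms = V_peak / sqrt(2), i.e. divide by ~1.414.
reading = 170.0                     # value shown on the meter, in volts
v_rms = reading / math.sqrt(2)      # what the RMS value would be IF the reading were a peak value
print(f"{reading} V peak -> {v_rms:.1f} V RMS")   # ~120.2 V
```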
Answer
The manual you linked to says, on page 3, "Average responding, calibrated in RMS of sine wave", so it should not be displaying the peak value of the AC voltage. I would not trust the reading, nor would I try to "adjust" it to RMS by dividing by 1.414.
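To make the "average responding, calibrated in RMS of sine wave" point concrete, here is a minimal sketch of the idealized behaviour of such a meter on a sine input; the numbers are nominal US mains values, not measurements from this particular meter:

```python
import math

# An average-responding meter rectifies the input and measures the average,
# then scales by the sine-wave form factor so the display reads RMS.
v_rms_true = 120.0                          # nominal US mains RMS
v_peak = v_rms_true * math.sqrt(2)          # ~169.7 V peak

rectified_avg = (2 / math.pi) * v_peak      # average of the rectified sine, ~108 V
form_factor = math.pi / (2 * math.sqrt(2))  # ~1.111, calibration factor for a sine
displayed = rectified_avg * form_factor     # value the meter should display

print(f"peak: {v_peak:.1f} V, rectified average: {rectified_avg:.1f} V, "
      f"displayed: {displayed:.1f} V")      # displayed ~120 V, not ~170 V
```

In other words, a correctly working average-responding, RMS-calibrated meter on 120 V mains should display about 120 V, so a steady 170-175 V reading points to a fault rather than to the meter showing peak voltage.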
I suggest you get another meter, as this one appears to be broken in some way.