Friday, 29 September 2017

Calibration of the readings of a current measuring circuit


OK, I'm making a small current meter device.
I take the voltage coming from a "shunt resistance", a current transducer that gives me an analog signal proportional to the current passing through it, and feed that signal into an ADC. Now I'm facing the problem of calibration.


My questions are:
- What methods are available to calibrate the reading?
- Are there methods I can perform programmatically, without the need for a reference (external device) or human intervention?


My accuracy goal for now is 1% of reading. I'm measuring AC current over a range of 0.5 A to 80 A. Heat is not a problem.




Answer



You should be able to compute the scaling factor. The resistor will make volts from the current according to Ohm's law. After that you should know what gain you have into the A/D and what range the A/D is using.
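For concreteness, here is a minimal sketch of that computation in C. The shunt resistance, amplifier gain, ADC reference voltage, and resolution below are assumed placeholder values, not numbers from the question; substitute your actual hardware values.

#include <stdint.h>

/* Placeholder front-end values (assumptions, not from the question). */
#define SHUNT_OHMS   0.001f    /* example 1 milliohm shunt            */
#define AMP_GAIN     50.0f     /* example gain ahead of the A/D       */
#define ADC_VREF     3.3f      /* A/D full-scale reference, volts     */
#define ADC_COUNTS   4096.0f   /* 12-bit converter                    */

/* Ohm's law: V_shunt = I * R, amplified by AMP_GAIN before the A/D, so
   I = (reading / ADC_COUNTS) * ADC_VREF / (AMP_GAIN * SHUNT_OHMS).    */
float adc_to_amps(uint16_t reading)
{
    float volts_at_adc = ((float)reading / ADC_COUNTS) * ADC_VREF;
    return volts_at_adc / (AMP_GAIN * SHUNT_OHMS);
}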



For 1%, you probably do need to do some calibration. A large enough known voltage source with a known resistor will give you a current. You can make the current as accurate as the resistor and your ability to measure the voltage across it. With a 1/2 % resistor and any reasonable voltmeter (has to be good to 1/2 % minimum), you can know the current to 1%, then store that and the zero reading in EEPROM and correct from those on the fly for each reading. Be aware that some of that might drift with temperature, so you want to calibrate at your center temperature or specify a narrow range.
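As a sketch of that procedure (the names and structure are mine, not from the answer): take one reading at zero current and one at the known current, derive a gain and offset from those two points, and apply them to every subsequent reading.

typedef struct {
    float gain;     /* amps per A/D count                  */
    float offset;   /* A/D counts reported at zero current */
} cal_t;

/* Derive the constants from the zero reading and the known-current reading. */
cal_t derive_cal(float counts_at_zero, float counts_at_known, float known_amps)
{
    cal_t c;
    c.offset = counts_at_zero;
    c.gain   = known_amps / (counts_at_known - counts_at_zero);
    return c;
}

/* Correct each raw reading on the fly. */
float corrected_amps(const cal_t *c, float raw_counts)
{
    return (raw_counts - c->offset) * c->gain;
}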


Added:


Component values and amplifier offsets vary over temperature. I was assuming a two point calibration, which can always be mathematically reduced to


OUT = IN*M + B


M is the gain adjustment and B the offset adjustment. Since both gain and offset are functions of temperature, any one set of M and B values is only valid at the particular temperature the measurements were made to derive them. If this calibration temperature is in the middle of your usage range, then the actual temperature will never be more than 1/2 the range off of the temperature the unit was calibrated at. This may possibly be good enough and not require temperature compensation. If instead you set M and B to calibrate the unit at one end of the temperature range, then the actual temperature at usage time could be the full range off from the calibration temperature, making the worst case error higher.
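In code, reducing two arbitrary calibration points to that M and B is just fitting a line through two points; the helper below only illustrates the algebra.

/* Reduce two calibration points (in1 -> out1, in2 -> out2) to OUT = IN*M + B. */
void two_point_to_m_b(float in1, float out1, float in2, float out2,
                      float *m, float *b)
{
    *m = (out2 - out1) / (in2 - in1);   /* gain adjustment   */
    *b = out1 - (*m * in1);             /* offset adjustment */
}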


Since you mentioned an A/D, you will have the measured values in digital form. This allows for performing the calibration equation above digitally. This also means the M and B values have to be stored in non-volatile memory somehow. The obvious answer is in the EEPROM of the same processor receiving the A/D readings. Calibrating digitally and storing the calibration constants in EEPROM is cheaper and better than ancient methods like trimpots. Trimpots cost real money, take board space, and themselves drift with time and temperature. On the other hand, most microcontrollers come with non-volatile memory, and usually have enough code space left over to perform the calibration computation at no additional cost. Even if not, using the next larger micro is usually a smaller increment than the cost of adding a trimpot.
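One way this might look on an AVR-class microcontroller using avr-libc's EEPROM routines (shown only as an example of the idea; any micro and any non-volatile store would do):

#include <avr/eeprom.h>

float EEMEM ee_m;            /* gain constant, written at calibration time */
float EEMEM ee_b;            /* offset constant                            */

static float cal_m, cal_b;   /* RAM copies used while running              */

void cal_save(float m, float b)          /* run once during calibration */
{
    eeprom_update_float(&ee_m, m);
    eeprom_update_float(&ee_b, b);
}

void cal_load(void)                      /* run at power-up */
{
    cal_m = eeprom_read_float(&ee_m);
    cal_b = eeprom_read_float(&ee_b);
}

float calibrated(float raw)              /* OUT = IN*M + B on every reading */
{
    return raw * cal_m + cal_b;
}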


As for AC measurements, why do you need them? Current shunts work at DC, so you should be able to calibrate the system at DC unless you have deliberately AC-coupled the signal for some reason.


