I apply a constant 4 mA current \$I\$ to a resistor \$R\$ and measure the voltage \$V\$ across it, so the resistance is \$ R = \dfrac{V}{I}\$.
When I apply \$ I_2 = 20\,\text{mA}\$ to the same resistor and measure the voltage \$V_2\$, I expect \$ R = \dfrac{V_2}{I_2}\$ to give the same value. But in the 4 mA case I obtain 248.2 Ω, while in the 20 mA case I obtain 248.7 Ω.
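For concreteness, here is a minimal sketch of the arithmetic behind the discrepancy. The voltages are hypothetical, back-calculated from the quoted resistances, since the question only gives the resulting resistance values:

```python
# Back-calculated voltages (hypothetical; derived from the quoted resistances)
V1, I1 = 0.9928, 0.004   # volts at 4 mA
V2, I2 = 4.9740, 0.020   # volts at 20 mA

R1 = V1 / I1   # 248.2 ohm
R2 = V2 / I2   # 248.7 ohm
print(f"R at 4 mA:  {R1:.1f} ohm")
print(f"R at 20 mA: {R2:.1f} ohm")
print(f"Difference: {(R2 - R1) / R1 * 100:.2f} %")   # about 0.2 %
```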
I can only measure voltages with a DAQ box. If I have a device with a 4–20 mA current-loop output, how can I translate the measured voltages to currents, given that I find different resistances at different currents? Is there a way to quantify the accuracy error? Is there a standard for that?
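One hedged approach, not taken from the question itself: calibrate the voltage-to-current conversion at the two loop endpoints and fit an affine model \$V = aI + b\$ through them, which absorbs the small resistance variation to first order. The numbers below are the back-calculated placeholders from above; substitute your own calibration measurements:

```python
# Two-point calibration over the 4-20 mA loop range (placeholder values)
I1, V1 = 0.004, 0.9928   # volts measured at 4 mA
I2, V2 = 0.020, 4.9740   # volts measured at 20 mA

# Affine model V = a*I + b fitted through the two calibration points
a = (V2 - V1) / (I2 - I1)
b = V1 - a * I1

def voltage_to_current(v):
    """Invert the calibration: estimate loop current (A) from a DAQ voltage (V)."""
    return (v - b) / a

print(voltage_to_current(2.5) * 1000, "mA")  # e.g. a mid-range reading
```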
Answer
Perhaps check the temperature coefficient of the resistor. At 20 mA the power dissipated in the resistor is 25× larger than at 4 mA, since power dissipation is proportional to \$I^2\$. The resistor heats up as you increase the current, and as it heats up its resistance changes. Incorporating the temperature coefficient into your conversion would improve accuracy. Another option is to use a much smaller resistance, so that the temperature rise over the current range stays small.
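A minimal sketch of that correction, assuming a hypothetical tempco `alpha` (take the real value from your resistor's datasheet) and the first-order model \$R(T) = R_0\,(1 + \alpha (T - T_0))\$:

```python
# First-order temperature-coefficient correction (alpha is a hypothetical value)
R0 = 248.2        # ohm, resistance at reference temperature T0
T0 = 25.0         # deg C, reference temperature
alpha = 100e-6    # 1/deg C, hypothetical tempco (100 ppm/deg C; see datasheet)

def resistance_at(T):
    """First-order model: R(T) = R0 * (1 + alpha * (T - T0))."""
    return R0 * (1 + alpha * (T - T0))

# Power dissipation scales with I^2: 25x more heating at 20 mA than at 4 mA
for I in (0.004, 0.020):
    P = I**2 * R0
    print(f"I = {I*1000:4.0f} mA -> P = {P*1000:.2f} mW")
```

As a rough plausibility check: the quoted 0.5 Ω shift is about 0.2 % of 248.2 Ω, which at a 100 ppm/°C tempco would correspond to roughly a 20 °C rise; that is a plausible amount of self-heating at the ~100 mW dissipated at 20 mA.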