tldr: Can I transmit an analog signal through a noisy environment without it being corrupted, or should I use a digital signal instead?
My requirement is to transmit a signal from an MA3 absolute magnetic encoder mounted 75 feet in the air to a DAQ module on the ground. This model offers either an analog or a digital interface. I would prefer the analog one, because it is easier to interface with a DAQ module.
With the analog interface, the sensor has 10-bit resolution, a 2.6 kSample/s update rate, and a 0-5 V output. To preserve linearity of the sensor data near the 0 V and 5 V boundaries, the load resistance should be greater than 4.7 kΩ and the load capacitance less than 100 pF.
The digital interface uses PWM, with 10 bits of resolution at a frequency of approximately 1 kHz. The RC oscillator driving the PWM holds the pulse period only to within 10% of its rated value, but this can be compensated by calculating the ratio of tOn to tOff for each cycle. To interface this with a DAQ module I would have to use counting logic. I am not a fan of this approach because the counting seems error-prone, and it would require learning software for the DAQ. However, I will use it if the analog route is impossible.
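For reference, the ratio trick above can be sketched in a few lines. This is a generic duty-cycle decode, not the exact mapping from the MA3 datasheet (which defines its own constants), so treat the scaling as an assumption to be checked against the datasheet:

```python
def decode_pwm(t_on: float, t_off: float) -> int:
    """Approximate 10-bit position (0-1023) from one measured PWM period.

    Dividing t_on by the full period (t_on + t_off) cancels the RC
    oscillator's +/-10% drift, because both times scale with the same
    clock. The exact code-to-duty mapping is datasheet-specific; this
    uses a simple linear scaling as a placeholder.
    """
    duty = t_on / (t_on + t_off)
    return max(0, min(1023, round(duty * 1023)))

# A 10% slower clock stretches t_on and t_off equally, so the
# decoded position is unchanged:
print(decode_pwm(1.0, 1.0))  # mid-scale
print(decode_pwm(1.1, 1.1))  # same position despite clock drift
```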
As for the shape of the signal, I expect it to jitter slightly as the sensor position fluctuates, but to change very slowly overall, since the shaft it is monitoring rotates at less than 1 RPM.
There are three complications to this.
- the signal has to pass through a slip ring
- the signal has to travel from a height of 75 feet to the ground - I have heard that long wires act as antennas for EMI, and that the cable's height may make this worse
- the wire carrying the signal will be no more than a foot away from wires carrying 3 phases of up to 40V of wild AC
My first question is: based on all of these sources of interference, does the analog signal stand any chance of maintaining integrity when it reaches the DAQ module? I lack the experience with signals to be able to predict whether this is feasible or not.
If you think this is possible, do you have any recommendations for making it work? The only approach I know of is shielding the wire, and keeping it as far away from the power lines as possible.
Thanks for your help
Answer
Can I transmit an analog signal through a noisy environment without it being corrupted, or should I use a digital signal instead?
Yes, you can transmit analog signals long distances without corruption, but proper EMI practice needs to be followed. The usual route is to amplify the signal at the sensor and then shield the cable; shielding protects against electric-field coupling, but not magnetic-field coupling. One LSB of a 10-bit, 0-5 V signal is about 5 mV, which isn't terribly difficult to hold in most cases, but in your environment, with AC power running close to the cable, keeping the signal that clean might be tricky.
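To make that noise budget concrete, here is the arithmetic behind the "about 5 mV" figure:

```python
# Noise budget for the analog route: one LSB of a 10-bit, 0-5 V signal.
full_scale = 5.0            # volts
codes = 2 ** 10             # 1024 codes for 10-bit resolution
lsb = full_scale / codes    # volts per code
print(f"1 LSB = {lsb * 1000:.2f} mV")  # 4.88 mV, rounded to "5 mV" above
```

Any noise approaching this level at the DAQ end eats directly into the encoder's effective resolution.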
The other problem with a cable is current on the shield (especially non-constant/AC current), which must be kept to a minimum: through the mutual inductance between the shield and the inner conductor(s), shield current creates voltage noise on the inner conductors. A long cable run can also create a ground loop. Since you also need to go through a slip ring, that is another potential noise source, because it is difficult to maintain shield continuity across it; the slip ring could be a big source of noise for an analog signal.
With a digital signal (and differential signaling such as RS-485) these problems can be avoided, and in my opinion it is easier to isolate a digital signal than an analog one. Given your environment, I would go the digital route.
If you still want to go the analog route and you have a power supply, you could run an experiment: set the supply to a known voltage, say 5 V, drive it through the full cable run, and measure the noise at the other end with a DMM. If the noise stays below 5 mV, the analog route is probably feasible.
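The pass/fail criterion for that experiment can be written down explicitly. The sample values and the `analog_is_feasible` helper below are hypothetical; the readings would come from your DMM or DAQ:

```python
LSB = 5.0 / 1024  # ~4.88 mV for a 10-bit, 0-5 V signal

def analog_is_feasible(readings: list[float], margin: float = 1.0) -> bool:
    """True if peak-to-peak noise on the readings stays under `margin` LSBs."""
    noise_pp = max(readings) - min(readings)
    return noise_pp < margin * LSB

# Hypothetical measurements of a 5.000 V source at the far end of the cable:
samples = [5.001, 5.000, 4.999, 5.002, 4.998]
print(analog_is_feasible(samples))  # 4 mV peak-to-peak < 4.88 mV -> True
```

A DMM reports an averaged value, so for a stricter test you would want a scope or the DAQ itself to catch fast spikes the DMM smooths over.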