A data-acquisition board has different input ranges. The board amplifies a voltage sample and an ADC converts it to a digital code. There is presumably a reference voltage inside the board that the sample is compared against to produce the code.
But how can I figure out in which range the gain is 1 or more? For example, if I set the range to +/-10 V, does that mean the gain is 1? What's the relation between the gain and the range, and why? Let's say I set the range to 1 V. Will the gain increase in this case compared to the 10 V range? I couldn't figure out the logic here.
Here is the board: http://www.mccdaq.com/PDFs/manuals/PCI-DAS6023-25.pdf
Answer
It appears to be a 12-bit converter, so the digital codes 0 to 4095 represent the range from -10 V to +10 V.
Gain is irrelevant as a concept here: if you want more sensitivity, such as a convertible range of -50 mV to +50 mV, then choose that setting and work with the same digital number range, redefined as spanning the new analogue range of -50 mV to +50 mV.
One thing to watch: the digital number that represents the analogue value may be 11 bits plus an MSB indicating the polarity (+ or -) of the signal. This is quite normal, but some ADCs use a different numbering system. Read the manual!
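To make the code-to-voltage relationship concrete, here is a minimal sketch. It assumes straight (offset) binary coding, where code 0 maps to the negative full scale and code 4095 to the positive full scale; as noted above, the actual coding scheme must be checked in the manual.

```python
def code_to_voltage(code: int, full_scale: float, bits: int = 12) -> float:
    """Map an ADC code (0 .. 2**bits - 1) onto -full_scale .. +full_scale.

    Assumes straight/offset binary coding; check the board manual,
    since some ADCs use sign-magnitude or two's complement instead.
    """
    max_code = (1 << bits) - 1          # 4095 for a 12-bit converter
    span = 2 * full_scale               # e.g. 20 V for the +/-10 V range
    return (code / max_code) * span - full_scale

# The same raw code means a different voltage on each range:
print(code_to_voltage(4095, 10.0))      # top code on the +/-10 V range: +10 V
print(code_to_voltage(4095, 0.05))      # top code on the +/-50 mV range: +50 mV
print(code_to_voltage(0, 10.0))         # bottom code: -10 V
```

This is why the range setting, not a "gain" number, is what you work with in software: the 0..4095 code span is simply re-labelled to the selected analogue span.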
ADDITION
There may be gain or there may be attenuation. Consider this:
If the ADC (at the heart of the circuit) has full scales of +10 V and -10 V then for the +/-50 mV range there has to be a gain of 200 between input signal and ADC. On the other hand (less likely of course), if the ADC's inherent full-scale range is +50 mV and -50 mV then there has to be an attenuation of 200:1 to cope with the +/-10 V range.
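The arithmetic above can be sketched briefly. This assumes, as in the first case described, that the internal ADC has a fixed +/-10 V full scale; the actual internal architecture is a property of the board's design, not something stated in the manual.

```python
# Required gain between the input terminals and the ADC, assuming the
# ADC itself has a fixed +/-10 V full scale (an assumption for illustration).
adc_full_scale = 10.0  # volts

for input_range in (10.0, 5.0, 0.5, 0.05):   # selectable +/- ranges in volts
    gain = adc_full_scale / input_range
    print(f"+/-{input_range} V range -> gain of {gain:g}")

# The +/-50 mV range needs a gain of 10 / 0.05 = 200, matching the text.
# If instead the ADC's native full scale were +/-50 mV, the same ratio
# would appear as a 200:1 attenuation on the +/-10 V range.
```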