I'm planning to shift the output of an AD8226 instrumentation amplifier by about 0.5 V using a 2.5 V precision reference and a voltage divider. According to the datasheet this is a bad idea, and the REF pin should be driven from a buffer instead.
I understand that if I don't use a buffer I'll probably end up with a slightly different voltage shift and a slight increase in gain. I'm guessing these can be calibrated out in software without a problem. If so, why should I use a buffer? Also, the last line of that datasheet note mentions a degradation in CMRR. I'm using the device to read a 0–10 V, very slowly changing, single-ended signal in an industrial environment. Will the CMRR degradation be significant?
P.S. The voltage divider uses two resistors: 20 kΩ on top and 4.7 kΩ on the bottom. A quick back-of-the-envelope check of what that divider does is sketched below.
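The unloaded output and Thevenin impedance follow directly from the resistor values. The "loaded" figure uses a crude first-order model in which the REF pin is treated as a plain resistance to ground; `R_REF` below is a placeholder I made up, not a datasheet value, and the real pin ties into the output-stage subtractor, so treat the loaded number as order-of-magnitude only.

```python
# Rough, first-order estimate of the REF-pin divider for the AD8226.
# Assumption (mine, not from the datasheet): the REF pin behaves like a
# simple resistance R_REF to ground.

V_REF_SRC = 2.5   # precision reference, volts
R_TOP = 20e3      # upper divider resistor, ohms
R_BOT = 4.7e3     # lower divider resistor, ohms
R_REF = 100e3     # placeholder REF-pin input resistance; check the datasheet

# Unloaded divider output and its Thevenin source resistance
v_unloaded = V_REF_SRC * R_BOT / (R_TOP + R_BOT)
r_thevenin = R_TOP * R_BOT / (R_TOP + R_BOT)

# Loaded output under the crude resistive model of the REF pin
r_bot_loaded = R_BOT * R_REF / (R_BOT + R_REF)
v_loaded = V_REF_SRC * r_bot_loaded / (R_TOP + r_bot_loaded)

print(f"Unloaded shift:     {v_unloaded:.4f} V")   # ~0.476 V
print(f"Thevenin impedance: {r_thevenin/1e3:.2f} kOhm")  # ~3.81 kOhm
print(f"Loaded shift:       {v_loaded:.4f} V (model-dependent)")
```

As I understand it, that ~3.8 kΩ Thevenin impedance is what unbalances the output-stage resistor network inside the in-amp, and that imbalance is the source of both the gain error and the CMRR degradation the datasheet warns about.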