Wednesday, 11 October 2017

Why is it a bad idea to divide a reference voltage with only resistors instead of using an op-amp buffer?



I'm planning on shifting the output of an AD8226 instrumentation amplifier by 'about' 0.5 V using a 2.5 V precision reference and a voltage divider. According to the datasheet, this is a bad idea and a buffer must be used:


[Datasheet excerpt: note on driving the AD8226 REF pin, ending with a line about CMRR degradation.]


I understand that if I don't use a buffer I'll probably end up with a slightly different voltage shift and a slight increase in gain. I'm guessing these can be calibrated out in software without a problem. If so, why should I use a buffer? Also, there is a note on the last line about a degradation in CMRR. I'm using the device to read a 0–10 V (very slowly changing) single-ended signal in an industrial environment. Will the CMRR degradation be significant?


P.S. The voltage divider uses two resistors: 20 kΩ and 4.7 kΩ.
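To put rough numbers on it, here is a minimal Python sketch (not part of the original question) that computes the divider's unloaded output and its Thevenin resistance, then estimates a CMRR ceiling. The internal network resistance R_INT and the unity-gain difference-amplifier rule of thumb are illustrative assumptions, not AD8226 datasheet values:

```python
import math

# A minimal sketch estimating what a 20k/4.7k divider does at the
# AD8226's REF pin. R_INT below is an ASSUMED value for illustration;
# the real internal network resistance is in the datasheet.

VREF = 2.5      # precision reference, volts
R_TOP = 20e3    # divider resistor from the reference, ohms
R_BOT = 4.7e3   # divider resistor to ground, ohms

# Unloaded divider output: the intended ~0.5 V shift.
v_shift = VREF * R_BOT / (R_TOP + R_BOT)          # ~0.476 V

# Thevenin (source) resistance seen by the REF pin.
r_thev = (R_TOP * R_BOT) / (R_TOP + R_BOT)        # ~3.81 kOhm

# The REF pin feeds the in-amp's internal difference-amplifier
# resistor network, so any series source resistance acts like a
# resistor mismatch in that network.
R_INT = 100e3                  # ASSUMED internal resistance, ohms
mismatch = r_thev / R_INT      # effective fractional mismatch, ~3.8%

# Classic difference-amp rule of thumb: CMRR is limited to roughly
# (1 + G) / (4 * mismatch); the output stage gain G is taken as 1.
cmrr_db = 20 * math.log10((1 + 1) / (4 * mismatch))

print(f"unloaded shift      : {v_shift:.3f} V")
print(f"Thevenin R at REF   : {r_thev / 1e3:.2f} kOhm")
print(f"CMRR ceiling (est.) : {cmrr_db:.0f} dB")
```

Under these assumptions the shift lands near 0.476 V rather than 0.5 V, and, more importantly, a few kilohms in series with REF can pull the CMRR down to a few tens of dB, far below what the part can otherwise deliver. That is why the datasheet insists on a low-impedance (buffered) drive.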




