Normally, the RMS value and the mean value of a perfect DC voltage are the same.
So let's say I want to measure the gain of an amplifier.
I first apply an offset Vin_off to the inputs and measure Vout_off.
Then I apply Vin to the inputs and measure Vout.
So I calculate the gain as:
\$G_{dB} = 20 \ \log_{10}\left(\frac{V_{o,mean} - V_{o,off,mean}}{V_{i,mean} - V_{i,off,mean}}\right)\$
Basically, I measure the gain as the ratio of the change in output voltage to the change in input voltage.
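For concreteness, here is a minimal Python sketch of that calculation. The arrays stand in for hypothetical repeated meter readings; all the sample values are made up for illustration.

```python
import numpy as np

# Hypothetical repeated readings of each node (volts)
vin_off = np.array([0.100, 0.101, 0.099])   # input with offset applied
vout_off = np.array([0.502, 0.498, 0.500])  # corresponding output
vin = np.array([0.200, 0.199, 0.201])       # input with test level applied
vout = np.array([1.001, 0.999, 1.000])      # corresponding output

# Gain from the change in mean output over the change in mean input
delta_out = vout.mean() - vout_off.mean()
delta_in = vin.mean() - vin_off.mean()
gain_db = 20 * np.log10(delta_out / delta_in)
print(f"G = {gain_db:.2f} dB")  # ~13.98 dB for a voltage gain of ~5
```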
But as you can see, I use mean values.
What if the input and output signals also have some noise on them?
Should mean values still be used, or RMS values instead, as in:
\$G_{dB} = 20 \ \log_{10}\left(\frac{V_{o,rms} - V_{o,off,rms}}{V_{i,rms} - V_{i,off,rms}}\right)\$
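To see how the two statistics behave differently on a noisy DC level, here is a small sketch; the noise level, sample count, and seed are arbitrary assumptions, not measured data. For zero-mean noise, the mean converges to the true DC level, while the RMS also picks up the noise power (\$V_{rms}^2 = V_{dc}^2 + \sigma^2\$).

```python
import numpy as np

rng = np.random.default_rng(0)
v_dc = 0.500                                 # true DC level (volts)
noise = rng.normal(0.0, 0.05, size=100_000)  # zero-mean Gaussian noise
v = v_dc + noise

mean_val = v.mean()               # -> ~0.500, the noise averages out
rms_val = np.sqrt(np.mean(v**2))  # -> ~sqrt(0.5**2 + 0.05**2) = 0.5025
print(mean_val, rms_val)
```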