Historically, when speaking about, let's say, an analog low-pass filter, the cutoff frequency of that filter is defined at -3 dB.
So this is the frequency where the output sinusoid's amplitude attenuates to about 0.707 (1/√2) of the input's — the voltage ratio. Squaring this ratio gives 50%, which is the power ratio.
It is obvious that at some point it was decided that the cutoff frequency should be the frequency at which a sinusoidal input of that frequency loses 50% of its power.
My question is: what could be the practical reason for defining it at 50%? Isn't 50% still a large amount, and how can that be associated with filtering? It would make sense to me if, for example, 95% had been chosen.
Answer
First, calling it the "cutoff frequency" leads to misconceptions. "Rolloff frequency" is a better name that gives you a more accurate mental picture of what is really happening.
Using the -3 dB point is not arbitrary. It falls out of the math naturally. For an R-C filter:
ω = 1 / RC
where ω is the frequency in radians/second, R is in Ohms, and C in Farads. For the frequency in Hz, use:
f = 1 / (2πRC)
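As a quick numeric check, here is a minimal sketch using the standard first-order RC low-pass magnitude response, |H(jω)| = 1/√(1 + (ωRC)²) (the component values are arbitrary examples, not from the answer):

```python
import math

R = 10_000.0   # resistance in ohms (example value)
C = 100e-9     # capacitance in farads (example value)

# Rolloff frequency in Hz, from f = 1 / (2*pi*R*C)
f_c = 1.0 / (2.0 * math.pi * R * C)

def gain(f):
    """Voltage gain of a first-order RC low-pass at frequency f (Hz)."""
    w = 2.0 * math.pi * f
    return 1.0 / math.sqrt(1.0 + (w * R * C) ** 2)

ratio = gain(f_c)               # voltage ratio at the rolloff frequency
db = 20.0 * math.log10(ratio)   # same thing in dB
power = ratio ** 2              # power ratio

print(f"f_c = {f_c:.1f} Hz")
print(f"gain = {ratio:.4f} ({db:.2f} dB), power ratio = {power:.3f}")
```

Running this shows the gain at f_c is 1/√2 ≈ 0.707, i.e. -3.01 dB, and the power ratio is exactly 0.5 — the 50% power point the question asks about.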
If you plot the Log(amplitude) as a function of Log(frequency), such as in a Bode plot, then the -3 dB frequency is where the asymptotes for the pass band and stop band meet. Put another way, at frequencies well into the pass band, the filter looks like a horizontal line. At frequencies well into the stop band, the filter is a line with a slope of 20 dB per decade (+ or - depending on high or low pass). If you draw those two lines and extend them to where they meet, it will be at the -3 dB rolloff frequency.
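The asymptote-intersection argument above can be sketched numerically (a minimal check for a low-pass, with hypothetical helper names and example component values):

```python
import math

R, C = 10_000.0, 100e-9                 # example component values
f_c = 1.0 / (2.0 * math.pi * R * C)     # rolloff frequency in Hz

def actual_db(f):
    """True magnitude response of the RC low-pass, in dB."""
    w = 2.0 * math.pi * f
    return -10.0 * math.log10(1.0 + (w * R * C) ** 2)

def passband_asymptote_db(f):
    """Well into the pass band: a horizontal line at 0 dB."""
    return 0.0

def stopband_asymptote_db(f):
    """Well into the stop band: -20 dB per decade through (f_c, 0 dB)."""
    return -20.0 * math.log10(f / f_c)

# The asymptotes meet where 0 = -20*log10(f/f_c), i.e. exactly at f = f_c.
# At that meeting point the true response sits 3 dB below them:
print(f"asymptotes meet at f_c = {f_c:.1f} Hz")
print(f"true response there: {actual_db(f_c):.2f} dB")
```

Far into the stop band (say 100 f_c) the true response and the straight-line asymptote agree to within a fraction of a dB, which is why the straight-line Bode sketch works so well.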