How can I calculate the minimum readable rise/fall time of a signal on an oscilloscope, from a practical point of view? Does it depend only on the bandwidth and sampling rate of the oscilloscope? Regards
Answer
Did you Google at all before asking? A simple search for "scope bandwidth vs rise time" brings up this document, which explains that the "rule of thumb" is that
$$\text{Rise Time} = \frac{0.35}{\text{Bandwidth}}$$
For example, a 350 MHz scope will have a rise time of about 1 ns.
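As a minimal sketch, here is that rule of thumb in Python. The function names and the helper structure are mine, not from the linked document; the 0.35 constant assumes a Gaussian-like scope response (flat-response scopes are often quoted closer to 0.4–0.45):

```python
# Rule-of-thumb conversion between -3 dB bandwidth and 10-90% rise time.
# The 0.35 factor assumes a Gaussian-like frequency response.

def rise_time_from_bandwidth(bw_hz: float, k: float = 0.35) -> float:
    """Approximate 10-90% rise time (seconds) for a given bandwidth (Hz)."""
    return k / bw_hz

def bandwidth_from_rise_time(tr_s: float, k: float = 0.35) -> float:
    """Approximate bandwidth (Hz) implied by a given 10-90% rise time (s)."""
    return k / tr_s

print(rise_time_from_bandwidth(350e6))  # ~1.0e-9 s, i.e. about 1 ns
print(bandwidth_from_rise_time(5e-9))   # 70e6 Hz for a 5 ns edge
```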
For reasonably accurate readings, the scope rise time should be no more than about 20% of the signal rise time. Another way of putting it is that the scope bandwidth should be 5× the highest frequency of interest.
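To see why 20% is "good enough", a common approximation (not stated explicitly in the answer above) is that for Gaussian-like responses the signal and scope rise times add in quadrature. The sketch below uses a hypothetical 5 ns edge to show that a scope rise time of 20% of the signal's inflates the reading by only about 2%:

```python
import math

def measured_rise_time(signal_tr: float, scope_tr: float) -> float:
    """Root-sum-of-squares combination of signal and scope rise times."""
    return math.hypot(signal_tr, scope_tr)

signal_tr = 5e-9             # hypothetical 5 ns signal edge
scope_tr = 0.2 * signal_tr   # scope rise time at 20% of the signal's
meas = measured_rise_time(signal_tr, scope_tr)
error = (meas - signal_tr) / signal_tr
print(f"measured {meas * 1e9:.2f} ns, error {error:.1%}")  # ~5.10 ns, ~2.0%
```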
Note that we're talking about the analog bandwidth here. The sample rate really has nothing to do with it, as long as it meets the Nyquist criterion for the bandwidth.
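For completeness, a tiny sketch of that sample-rate condition. The strict Nyquist limit is twice the bandwidth; the 2.5× default margin here is my own illustrative choice, since real scopes typically sample at several times their analog bandwidth:

```python
def min_sample_rate(bw_hz: float, margin: float = 2.5) -> float:
    """Sample rate needed to satisfy Nyquist (>2x bandwidth) with some margin."""
    return margin * bw_hz

print(min_sample_rate(350e6) / 1e6, "MS/s")  # 875.0 MS/s for a 350 MHz scope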