Oscilloscopes let us measure input voltages over a wide variety of signal ranges: AC or DC, up to ±500 V, or frequencies into the gigahertz range.
I am wondering: how do they make that happen?
Say an oscilloscope has a microprocessor/microcontroller/ADC unit at its heart, which usually works with 0..5 V DC. The input signal may be AC rather than DC, and may have huge peak values compared to the 0..5 V range.
There should be an input cascade which downscales this voltage to a reasonable range (for example, ±2.5 V) and offsets it into the working range of the ADC/MPU/MCU (0..5 V). It is also possible to use a slightly narrower range to allow for error/peak values, say 1.25..3.75 V.
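The scale-and-offset step above can be sketched numerically. This is a minimal illustration with assumed numbers (a ±400 V full-scale input mapped into the 1.25..3.75 V window mentioned above), not how any particular scope does it:

```python
# Hypothetical mapping of a wide bipolar input into an ADC window.
# FULL_SCALE and the ADC window are assumed values from the question,
# not taken from a real instrument.

FULL_SCALE = 400.0            # |input| at the edge of the range, volts
ADC_LO, ADC_HI = 1.25, 3.75   # usable ADC window, volts

def to_adc(v_in: float) -> float:
    """Scale and offset v_in (within ±FULL_SCALE) into the ADC window."""
    gain = (ADC_HI - ADC_LO) / (2 * FULL_SCALE)   # fractional gain
    offset = (ADC_HI + ADC_LO) / 2                # window midpoint, volts
    return v_in * gain + offset

print(to_adc(-400.0))  # ≈1.25
print(to_adc(0.0))     # 2.5 (zero input lands mid-window)
print(to_adc(400.0))   # ≈3.75
```

The gain here (2.5/800 ≈ 0.003) is exactly the kind of fractional gain the op-amp option below would need to provide.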
There are three ways I know of to downscale and offset an input signal:
- potential transformer (probably won't work for DC)
- resistor-based voltage divider
- operational amplifier configured with a fractional gain (between 0 and 1)
A potential transformer won't work for DC.
A resistor ladder / voltage divider will work, but it is very sensitive to the resistance of the rest of the device:
(schematic: a resistive voltage divider loaded by the device's input resistance Rdevice, created using CircuitLab)
From what I know, the rest of the device may have variable resistance, so the calculated resistor values won't be reliable (given that Rdevice may vary).
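The loading effect described above is easy to quantify. A minimal sketch with illustrative values (a nominal 100:1 divider; the resistor and load values are assumptions, not from the question's schematic):

```python
# Why a bare resistive divider is load-sensitive: the load appears in
# parallel with the bottom resistor and changes the division ratio.

R1, R2 = 990e3, 10e3   # ohms: a nominal 100:1 divider

def v_out(v_in: float, r_load: float) -> float:
    """Divider output when the next stage loads R2 with r_load."""
    r2_eff = (R2 * r_load) / (R2 + r_load)   # R2 in parallel with r_load
    return v_in * r2_eff / (R1 + r2_eff)

print(v_out(100.0, 1e12))  # ≈1.0 V: effectively unloaded
print(v_out(100.0, 10e3))  # ≈0.5 V: a 10 kΩ load halves the reading
```

This is why practical dividers are followed by a high-impedance buffer, or designed as compensated attenuators as the answer below describes.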
An op-amp input cascade might do the trick, but the IC would need very specific characteristics, since input offset voltage, temperature drift, slew rate, and bandwidth might significantly distort the signal.
And I am not quite sure whether high voltages or AC input work well with an op-amp.
My question is: is there an effective way of downscaling an input signal that is AC or DC, high frequency (up to the GHz range), or has high peak values (up to 400 V)? Or, how is this implemented in oscilloscopes?
Answer
This question belongs to the extensive engineering field of data acquisition systems and instrumentation. Signal scaling is usually done in two stages: a probe attenuator, and the scope's input amplifier.
To begin with, there is no such thing as a "500 V" signal at GHz frequencies; a signal like that would likely fry everything around it, and it occurs only near high-powered RF antennas, where different observation methods are used.
So in practice you have either "400 V" at a few hundred kHz, or gigahertz signals at the 1–5 V scale.
The first problem is solved by scaling the signal with 10:1 or 100:1 passive voltage oscilloscope probes, which are essentially capacitance-compensated voltage dividers. To get an impression, look at this article.
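The key idea of a compensated divider can be sketched in a few lines. The component values below are typical textbook numbers for a 10:1 probe (9 MΩ tip resistor, ~20 pF of scope-plus-cable capacitance), assumed for illustration:

```python
# Compensation condition for a 10:1 passive probe: R1/C1 in the probe
# tip, R2/C2 at the scope input. When R1*C1 == R2*C2, the division
# ratio is the same at DC and at high frequency.

R1, R2 = 9e6, 1e6    # ohms: 10:1 attenuation at DC
C2 = 20e-12          # farads: scope input + cable capacitance (assumed)

dc_attenuation = (R1 + R2) / R2   # 10.0
C1 = R2 * C2 / R1                 # tip capacitance that compensates C2

print(dc_attenuation)  # 10.0
print(C1)              # ≈2.2e-12 F: the trimmer in the probe tip
```

C1 is the small adjustable capacitor you tweak when you "compensate" a probe against the scope's square-wave reference.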
The second type of pre-scaling uses "active probes", which combine ultra-fast operational amplifiers with finely tuned coaxial transmission lines.
In both cases the signal is brought roughly into the ±5 V range. The rest of the scaling is done inside the oscilloscope by variable-gain wide-band amplifiers, frequently referred to as the "front end"; see an example here.
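The variable-gain stage can be thought of as mapping the user's volts/div setting onto the ADC's fixed full scale. A minimal sketch with assumed numbers (2 V peak-to-peak ADC full scale, 8 vertical divisions; real front ends switch among discrete gain ranges):

```python
# Illustrative volts/div -> front-end gain mapping. ADC_FULL_SCALE and
# DIVISIONS are assumptions, not figures from the answer.

ADC_FULL_SCALE = 2.0   # volts peak-to-peak at the ADC input
DIVISIONS = 8          # vertical divisions on screen

def pga_gain(volts_per_div: float) -> float:
    """Gain the front end must apply after the probe attenuator."""
    screen_span = volts_per_div * DIVISIONS   # peak-to-peak input span
    return ADC_FULL_SCALE / screen_span

print(pga_gain(0.25))  # 1.0: a 2 Vpp signal fills the ADC directly
print(pga_gain(0.05))  # ≈5: small signals get amplified, not attenuated
```

Note the gain can be above or below 1: the front end both amplifies small signals and further attenuates large ones, which is why it is more involved than a single fixed-gain op-amp stage.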
So the whole thing is a bit more complicated than an ordinary operational amplifier.