Referring to the following schematic:
My current understanding is that a transistor will output a certain drain current given an input voltage at its gate (V1 and V2). How can this behavior hold true in the schematic shown, since there will be two "competing" current sources? Which transistor sets the current of the circuit?
Answer
Which transistor sets the current of the circuit?
The transistor which tries to make the lowest current.
For a transistor to determine its own drain current it must be in saturation mode. That requires Vds > Vgs - Vt, so there must be enough Vds and not too much Vgs. If the condition fails (Vgs too large or Vds too small), the transistor operates in linear (triode) mode and behaves like a resistor.
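To make the two regions concrete, here is a minimal sketch of the ideal square-law model. The parameter values (Vt = 0.7 V, k = 200 uA/V^2) are invented for illustration and have nothing to do with the schematic:

```python
# Idealized square-law NMOS model; vt and k (= 0.5 * mu * Cox * W/L)
# are illustrative values, not taken from the schematic.
def nmos_id(vgs, vds, vt=0.7, k=200e-6):
    """Drain current in amperes."""
    vov = vgs - vt                    # overdrive voltage Vgs - Vt
    if vov <= 0:
        return 0.0                    # cut-off: no channel, no current
    if vds > vov:
        return k * vov**2             # saturation: Id set by Vgs alone
    return k * (2*vov*vds - vds**2)   # linear/triode: Id collapses with Vds

print(nmos_id(1.2, 1.0))  # saturation: about 5e-05 A, independent of Vds
print(nmos_id(1.2, 0.1))  # triode: about 1.8e-05 A, behaves like a resistor
```

Only in saturation does the drain current depend on Vgs alone; in triode it falls off with Vds, and that is exactly the mechanism that lets one transistor override the other.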
Suppose that both the NMOS and the PMOS have enough Vds to be in saturation, for example when Vout = Vdd/2. Suppose the NMOS wants to make 100 uA flow but the PMOS wants 200 uA to flow. Which one will win?
So there is 200 uA pulling "up" and 100 uA pulling "down"; what does the voltage on Vout do? It goes up: 200 uA pulling up minus 100 uA pulling down leaves a net 100 uA flowing into the output node, charging it upward.
So the voltage on Vout will go up. What does this mean for the transistors? For the NMOS this is good news: its Vds increases, so it can continue to make 100 uA flow. No problem!
For the PMOS things are different: as Vout goes up, its Vds decreases until the PMOS drops out of saturation and enters linear mode. At that point the PMOS has no control over the current. It wants to make 200 uA flow, but the NMOS prevents that by taking all the voltage headroom. So the NMOS wins, since it wanted the lowest current.
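This tug-of-war is easy to reproduce numerically. The sketch below is my own illustration, not part of the original answer: all device values (Vdd, Vt, k) are invented, and the gate drives are chosen so the NMOS wants 100 uA and the PMOS wants 200 uA in saturation. It sweeps Vout to find where the pull-up and pull-down currents match:

```python
# Same idealized square-law model as above; all parameter values are invented.
import math
import numpy as np

VDD, VT, K = 3.3, 0.7, 200e-6  # supply, threshold, 0.5 * mu * Cox * W/L

def square_law_id(vgs, vds, k=K, vt=VT):
    """Drain current (A); works for the PMOS with source-referred Vsg, Vsd."""
    vov = vgs - vt
    if vov <= 0:
        return 0.0
    return k * vov**2 if vds > vov else k * (2*vov*vds - vds**2)

vgs_n = VT + math.sqrt(100e-6 / K)   # gate drive so the NMOS wants 100 uA
vsg_p = VT + math.sqrt(200e-6 / K)   # gate drive so the PMOS wants 200 uA

vout = np.linspace(0.0, VDD, 100_001)
i_up = np.array([square_law_id(vsg_p, VDD - v) for v in vout])  # PMOS pull-up
i_dn = np.array([square_law_id(vgs_n, v) for v in vout])        # NMOS pull-down
v_eq = vout[np.argmin(np.abs(i_up - i_dn))]  # node where the currents balance

print(f"Vout settles near {v_eq:.2f} V (Vdd = {VDD} V)")
print(f"PMOS Vsd = {VDD - v_eq:.2f} V < overdrive {vsg_p - VT:.2f} V -> triode")
print(f"loop current = {square_law_id(vgs_n, v_eq) * 1e6:.0f} uA")
```

With these numbers Vout settles around 3.0 V: the NMOS stays saturated at 100 uA while the PMOS is squeezed into triode near the top rail, exactly as argued above.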
Also, if you switch a transistor off by making Vgs = 0, that transistor will win: there is nothing the other transistor can do to make any current flow.
Realize that it is easy to make less current flow (just drop some voltage across the device), but it is impossible to make more current flow than the current a transistor gets from elsewhere.