I am trying to design a voltage-controlled current source supplying 0 to ±2 A DC into a 20 Ω, 100 mH load from a ±100 V DC supply. I need at least 3 kHz bandwidth for small signals, but my current circuit rolls off well below that and I am struggling to find out why. Here is the circuit: Note that the op-amp is supplied with ±120 V even though that is not achievable in practice; I want to get this circuit working first, before dealing with how to drive the transistor gates.
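As a sanity check on the specification (my own back-of-the-envelope numbers, assuming the load is exactly 20 Ω + 100 mH):

$$ f_{pole} = \frac{R}{2\pi L} = \frac{20}{2\pi \cdot 0.1} \approx 32\ \text{Hz}, \qquad |Z(3\ \text{kHz})| = \sqrt{R^2 + (2\pi f L)^2} \approx 1.9\ \text{k}\Omega $$

so within ±100 V the full ±2 A is only reachable below roughly 70 Hz, and at 3 kHz the compliance limits the swing to about 100 V / 1.9 kΩ ≈ 53 mA peak. That is why the 3 kHz requirement applies to small signals only.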
Transient response for a 1 kHz control signal (green: MOSFET output, light blue: op-amp output, dark blue: sense voltage, red: control): I do not understand why the voltage is so slow to decrease. Where could that come from?
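One limit I am trying to rule out (it only matters if the plotted run is near the full ±2 A; for genuinely small signals it scales away): the supply can only force a limited di/dt through the 100 mH inductance,

$$ \left.\frac{di}{dt}\right|_{max} \lesssim \frac{V_{supply}}{L} = \frac{100\ \text{V}}{0.1\ \text{H}} = 1\ \text{A/ms}, $$

whereas a full-scale 2 A, 1 kHz sine needs a peak di/dt of about 2 · 2π · 1000 ≈ 12.6 A/ms. If that were the cause, though, a faster op-amp should not help, so it is probably not the whole story.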
Edit: I have tried an LT1226, for example, and it works very well (10 kHz bandwidth)... Why?
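For reference, this is how I am modelling the loop when comparing op-amps (assuming the feedback is taken across a sense resistor R_s in series with the load; R_s is in the schematic and its value is not restated here):

$$ T(s) \approx A_{OL}(s)\,\frac{R_s}{R_s + R + sL}, \qquad f_{pole} = \frac{R + R_s}{2\pi L} \approx 32\ \text{Hz} \ \ (R_s \ll 20\ \Omega) $$

Above that pole the load contributes an extra 20 dB/decade of loop-gain roll-off, so the achievable crossover (and hence the closed-loop bandwidth) scales with the op-amp's gain-bandwidth product, which might explain why a much faster part behaves so differently.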
My second problem is how to drive the gates of the transistors (which, looking at the simulation, need gate voltages beyond the drain supply rails): a) in case I buy lab supplies to provide rails higher than ±100 V, or b) in case I make do with the lab supplies I already have. Any suggestions are welcome...
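A rough idea of the headroom involved (the V_GS figure is a placeholder, not taken from a datasheet): if I treat the output devices as source followers, which is how they behave in the simulation, the gate has to sit about one V_GS beyond the output,

$$ V_{G} \approx V_{out} + V_{GS} $$

so with the output near +100 V and a V_GS of a few volts, the upper gate needs to go several volts above the +100 V rail, and symmetrically the lower gate needs to go below −100 V.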
Edit: Dissipation, cost, and space are not design drivers, but speed and stability are.