I'm trying to design a circuit that will let me drive an LED or a diode laser with an RF signal up to ~20 MHz. Since the optical device needs to be biased with a DC current, a bias-T is a natural choice. I was planning to use a GALI-84+ amplifier from Mini-Circuits to amplify my RF signal and then pass it into a bias-T circuit. However, while looking at the schematic for the GALI-84+ eval board, I noticed that the output of the amplifier is biased through an inductor from a DC supply before it is AC coupled to the output through a capacitor.
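For reference, here's a rough way to size the bias-T parts in Python, assuming a 50 Ω RF path and a signal band reaching down to about 1 MHz (the lower edge of my band isn't pinned down, so adjust f_low to suit):

```python
import math

# Rough bias-T sizing. Assumptions: 50 ohm RF path, signal band down to ~1 MHz.
Z0 = 50.0      # system impedance, ohms
f_low = 1e6    # assumed lowest signal frequency, Hz

# Rule of thumb: the DC-feed inductor should look like >>Z0 (say 10*Z0) at
# the lowest frequency, and the coupling cap like <<Z0 (say Z0/10).
L_min = 10 * Z0 / (2 * math.pi * f_low)   # henries
C_min = 10 / (2 * math.pi * f_low * Z0)   # farads

print(f"L >= {L_min * 1e6:.0f} uH, C >= {C_min * 1e9:.0f} nF")
```

That works out to roughly 80 µH and 32 nF for a 1 MHz lower edge.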
I contacted Mini-Circuits asking whether I could remove the coupling capacitor and use this circuit to both bias the diode and amplify my signal at once, but they said:
The GALI-84+ is an RF amplifier designed to drive 50 Ohm loads. To drive an LED, you will need to configure the amplifier conventionally, detect the RF output signal using a simple diode detector and use it to turn on a transistor with the LED in the collector.
They described the diode detector as "a series small-signal Schottky diode feeding a 1K resistor to ground", with the Schottky then connected to the base of a transistor.
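If I understand their suggestion, the detector turns the RF amplitude into a DC level that then sets the transistor's current. A back-of-envelope sketch of the levels involved (every value below is my assumption for illustration, including the emitter resistor, which they didn't mention):

```python
# Back-of-envelope numbers for the detector-plus-transistor scheme.
# All component values here are assumptions, not from Mini-Circuits.
V_pk = 2.0        # assumed peak RF voltage at the amplifier output, V
V_schottky = 0.3  # typical small-signal Schottky forward drop, V
V_be = 0.7        # silicon transistor base-emitter drop, V
R_e = 10.0        # hypothetical emitter degeneration resistor, ohms

V_det = V_pk - V_schottky             # DC level across the 1K detector load
I_led = max(V_det - V_be, 0.0) / R_e  # emitter ~ collector (LED) current
print(f"detector out ~ {V_det:.1f} V, LED current ~ {I_led * 1e3:.0f} mA")
```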
So, now I'm confused:
1) Why use a diode detector? 2) Will I damage anything if I use the GALI to drive a non-50 Ω load directly?
Update: Is there any reason I can't use this configuration to drive my LED? Since the amplifier appears to be a Darlington pair, wouldn't it be appropriate for driving a variable current? The non-hand-drawn portion is the GALI-84+ schematic from the datasheet.
Update: A few details on the specifics. I need to drive a diode laser with up to 200 mA of current. The laser turns on at about 4 V with 40 mA of current, and passes about 200 mA by 5.5 V. I need to bias the laser on (above the turn-on voltage) and drive it with an AC signal: bias at about 4.75 V and drive with roughly ±100 mA of current, or about ±0.5 V. The diode was extracted from a Blu-ray player, so I don't have detailed specifications on its behavior.
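For what it's worth, here's a quick estimate of the load the driver would see, assuming the laser's I-V curve is roughly linear between the two quoted points:

```python
import math

# Estimate the laser's small-signal load from the two quoted points,
# assuming the I-V curve is roughly linear between them.
dV = 5.5 - 4.0           # V, between turn-on and full drive
dI = 0.200 - 0.040       # A, over the same range
r_d = dV / dI            # dynamic resistance, ohms

i_pk = 0.100             # A, desired peak AC drive
v_pk = i_pk * r_d        # V, resulting peak voltage swing
p_avg = v_pk * i_pk / 2  # W, average AC power into the diode
print(f"r_d ~ {r_d:.1f} ohm, swing ~ {v_pk:.2f} V pk, "
      f"power ~ {p_avg * 1e3:.0f} mW ({10 * math.log10(p_avg / 1e-3):.1f} dBm)")
```

So by these numbers the diode looks like roughly 9 Ω, nowhere near the 50 Ω the GALI-84+ expects, and ±100 mA actually implies closer to ±0.9 V than the ±0.5 V I guessed above.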
Thanks.