Driving a transistor with limited gate (or base) voltage makes it limit current, which introduces a significant voltage drop across the transistor and causes it to dissipate power. This is generally considered bad: it wastes energy and shortens the component's life. But if I keep the temperature low, either with a heat sink or by limiting the power, is it okay to use a MOSFET this way? Or is it fundamentally bad for the component to make it dissipate power?
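To put rough numbers on the dissipation concern, here's a back-of-the-envelope calculation. The 6 V drop, 0.5 A current, and 0.05 Ω on-resistance are made-up illustrative values, not measurements from my circuit:

```cpp
#include <cstdio>

int main() {
    // Made-up operating point: a 12 V strip dimmed so the MOSFET,
    // operating in its linear region, drops 6 V while passing 0.5 A.
    const double v_ds = 6.0;                 // drain-source drop (V), assumed
    const double i_d  = 0.5;                 // drain current (A), assumed
    printf("linear-mode dissipation: %.2f W\n", v_ds * i_d);   // 3.00 W

    // The same current through a fully enhanced FET (R_DS(on) ~ 0.05 ohm,
    // typical for a logic-level part) dissipates orders of magnitude less.
    const double r_ds_on = 0.05;             // on-resistance (ohm), assumed
    printf("switched dissipation:    %.4f W\n", i_d * i_d * r_ds_on);  // 0.0125 W
    return 0;
}
```

A few watts is real heat, but nothing a modest heat sink can't handle, which is the crux of my question.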
I ask because I get excellent results controlling a MOSFET with a variable gate voltage to drive an LED strip. With 8-bit PWM, the first step jumps the LED from fully off to "reading a book" brightness, while the voltage-driven MOSFET turns on very smoothly, despite also using only 8 bits of voltage levels. Linear versus exponential power makes all the difference: PWM steps are linear in power, and our eyes don't perceive light linearly. The voltage-controlled result is too good not to use.
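To illustrate why linear PWM steps feel so coarse at the bottom: the eye responds roughly to the ratio between adjacent levels, and near zero those ratios are enormous. A quick sketch, pure arithmetic with no hardware assumptions:

```cpp
#include <cstdio>

int main() {
    // Each 8-bit PWM step adds 1/255 of full power -- linear in duty.
    // Perceived brightness tracks ratios between levels, so the same
    // one-count step is a 100% jump at the bottom of the range and a
    // 0.4% jump at the top.
    const int steps[] = {2, 3, 4, 10, 255};
    for (int n : steps) {
        double jump = 100.0 * (double(n) / (n - 1) - 1.0);
        printf("duty %3d/255 -> %3d/255: +%5.1f%% brightness step\n",
               n - 1, n, jump);
    }
    return 0;
}
```

And the very first step, from 0/255 to 1/255, goes from fully off to visible light, which is exactly the jump I'm describing.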
Addendum: I have experimented extensively with PWM, including adjusting the timer prescalers. Changing the PWM duty cycle is not an effective solution, though if someone wants to donate an oscilloscope, I might be able to make it work :)
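For concreteness, the kind of timer reconfiguration I'm talking about looks roughly like this. This is an illustrative sketch assuming an ATmega328-class AVR (register names from avr-libc), not the exact code from my experiments:

```cpp
// Timer1 in Fast PWM mode 14, with ICR1 as TOP, gives 16-bit duty
// resolution on OC1A (Arduino pin 9) instead of analogWrite()'s 8 bits.
#include <avr/io.h>

void setupPwm16(void) {
    DDRB  |= _BV(PB1);                             // OC1A / Arduino D9 as output
    TCCR1A = _BV(COM1A1) | _BV(WGM11);             // non-inverting output
    TCCR1B = _BV(WGM13) | _BV(WGM12) | _BV(CS10);  // mode 14, no prescaling
    ICR1   = 0xFFFF;                               // TOP = 65535 -> ~244 Hz at 16 MHz
}

void writePwm16(uint16_t duty) {
    OCR1A = duty;                                  // 0..65535
}
```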
Addendum 2: The project is a light-up alarm clock, like these Philips products, but more carefully tuned. It is imperative that the gradations between the low power levels be minuscule: the brightest acceptable low-power state is around 0.002% of full power, and the next step up is 0.004%. If it's an x/y problem to ask about the solution rather than the problem, then this is an intentional x/y question: I've settled on my preferred solution after extensive testing, and I want to know whether it's workable. The device currently works with a less-preferred workaround involving a much dimmer auxiliary light.
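Working the numbers makes the resolution gap concrete (plain arithmetic from the figures above):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double target = 0.002 / 100.0;   // dimmest required level: 0.002% of full power
    const double step8  = 1.0 / 255.0;     // smallest nonzero 8-bit duty cycle
    printf("8-bit PWM floor: %.3f%% of full power (%.0fx too bright)\n",
           step8 * 100.0, step8 / target);              // ~0.392%, ~196x
    printf("duty bits needed for 0.002%%: %.1f\n",
           std::log2(1.0 / target));                    // ~15.6 bits
    return 0;
}
```

So even a full 16-bit timer only just reaches the dimmest required step, which is consistent with my finding that duty-cycle adjustment alone falls short.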
Addendum 3: I gather this is what BJTs are used for. Since they're current-controlled, the circuit is much harder to design; I'll need to look into it when I have time to draw diagrams, and I'll post another question if I run into trouble.