Suppose I have a 100 mAh battery at 20 V and I connect a 1 kΩ resistor across it. How much heat will be generated, and how can I find the temperature rise in the resistor? As the battery discharges, I think the current flow will reduce over time, but I am not sure how the voltage of a real battery behaves. Perhaps I am not giving sufficient information here; I am sorry for that.
I just wish to know: what information is needed to make such a calculation? Have you ever done it? In the ideal case (taking only the most significant factors into consideration), what factors are considered in estimating the heat dissipation and temperature rise, and why would the heat dissipation and temperature measured in an actual practical experiment be different?
I know this question looks hard, but I will be very happy if I can finally have this mystery resolved.
Answer
The power delivered to a resistor, all of which it converts to heat, is the voltage across it times the current through it:
P = IV
where P is power, I is current, and V is voltage. The current through a resistor is related to the voltage across it and the resistance:
I = V/R
where R is the resistance. With this additional relation, you can rearrange the above equations to make power as a direct function of voltage or current:
P = V²/R
P = I²R
It so happens that if you stick to units of Volts, Amps, Watts, and Ohms, no additional conversion constants are required.
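To make that concrete, here is a minimal sketch in Python (the 5 V and 250 Ω values are made-up example numbers, not from your question) showing that all three expressions give the same power:

    # Hypothetical example: show that P = I*V, V**2/R, and I**2*R agree.
    V = 5.0      # volts across the resistor
    R = 250.0    # resistance in ohms
    I = V / R    # Ohm's law gives the current in amps

    print(I * V)       # P = I*V    -> 0.1 W
    print(V**2 / R)    # P = V**2/R -> 0.1 W
    print(I**2 * R)    # P = I**2*R -> 0.1 W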
In your case you have 20 V across a 1 kΩ resistor:
(20 V)²/(1 kΩ) = 400 mW
That's how much power the resistor will be dissipating.
The first step in dealing with this is to make sure the resistor is rated for that much power in the first place. Obviously, a "¼ Watt" resistor won't do. The next common size is "½ Watt", which can take that power in theory, with all appropriate conditions met. Read the datasheet carefully to see under what conditions your ½ Watt resistor can actually dissipate ½ Watt. It might specify that the ambient has to be 20 °C or less with a certain amount of ventilation. If this resistor is on a board that is in a box with something else that dissipates power, like a power supply, the ambient temperature could be significantly more than 20 °C. In that case, the "½ Watt" resistor can't really handle ½ Watt, unless perhaps air from a fan is actively blowing across its top.
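If you want to sanity-check the rating against ambient temperature, a typical datasheet gives a derating curve: full rated power up to some knee temperature, falling linearly to zero at the maximum operating temperature. Here is a rough sketch (the 70 °C knee and 155 °C zero-power point are assumptions for a generic film resistor, not from any particular datasheet):

    # Sketch of a linear derating curve. The knee and zero-power
    # temperatures are assumed example values; use your datasheet's.
    def derated_power(rated_w, ambient_c, knee_c=70.0, zero_c=155.0):
        if ambient_c <= knee_c:
            return rated_w
        if ambient_c >= zero_c:
            return 0.0
        return rated_w * (zero_c - ambient_c) / (zero_c - knee_c)

    print(derated_power(0.5, 25))   # 0.5 W: full rating at room temperature
    print(derated_power(0.5, 100))  # ~0.32 W: no longer enough for a 0.4 W load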
To know how much the resistor's temperature will rise above ambient you will need one more figure, which is the thermal resistance of the resistor to ambient. This will be roughly the same for the same package types, but the true answer is available only from the resistor datasheet.
Let's say, just to pick a number (out of thin air, I didn't look anything up, example only), that the resistor with suitable copper pads has a thermal resistance of 200 °C/W. The resistor is dissipating 400 mW, so its temperature rise will be about (400 mW)(200 °C/W) = 80 °C. If it's on an open board on your desk, you can probably figure on a 25 °C maximum ambient, so the resistor could reach 105 °C. Note that's hot enough to boil water, but most resistors will be fine at this temperature. Just keep your finger away. If this is on a board in a box with a power supply that raises the temperature in the box 30 °C above ambient, then the resistor temperature could reach (25 °C) + (30 °C) + (80 °C) = 135 °C. Is that OK? Don't ask me; check the datasheet.
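Putting the arithmetic above in one place, as a quick sketch (same made-up 200 °C/W figure):

    # Temperature rise = power dissipated times thermal resistance to ambient.
    P_W   = 0.4    # watts, from (20 V)**2 / 1 kohm
    THETA = 200.0  # degC/W; made-up example figure, get the real one from the datasheet

    rise_c = P_W * THETA      # 80 degC rise above local ambient
    print(25 + rise_c)        # 105 degC: open board at 25 degC room temperature
    print(25 + 30 + rise_c)   # 135 degC: in a box running 30 degC above ambient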