rule of thumb wattage sizing

Thread Starter


Joined Apr 6, 2009
What do you guys use as a rule of thumb for safely sizing for wattage?

My resistor will see .5 watts. Calculated like this...
9 V − 3.6 V (LEDs in series) = 5.4 V across the resistor (LEDs are 1.2 Vf, 100 mA)
5.4 V / 58 Ω = 0.093 A
0.093² × 58 = 0.50 watts
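The arithmetic above can be checked with a short script (values taken from the post; the 100 mA figure is presumably the LED rating, since the calculated current comes out at ~93 mA):

```python
# LED dropping-resistor calculation from the post:
# 9 V supply, 3 LEDs in series at 1.2 Vf each, 58 ohm resistor.
V_SUPPLY = 9.0
V_LED = 1.2
N_LEDS = 3
R = 58.0

v_resistor = V_SUPPLY - N_LEDS * V_LED  # 5.4 V across the resistor
i = v_resistor / R                      # ~0.093 A
p = i ** 2 * R                          # ~0.50 W dissipated in the resistor

print(f"I = {i * 1000:.1f} mA, P = {p:.3f} W")
```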

What size resistor should I use? 1 WATT?


Joined Sep 20, 2005
Yes, I would use a 1 W rated resistor. Or a switching LED driver, so I don't waste so much energy in a dropping resistor.


Joined Feb 11, 2008
My industry (telecom) / some design standards require calculated wattage × 3.
Standards have relaxed a lot! We were taught wattage × 10, and they were very serious about it. But in those days they expected stuff to last indefinitely, even in a hot industrial environment.
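The safety-factor rule above boils down to: multiply the calculated dissipation by the factor, then round up to the next standard rating. A minimal sketch (the list of standard ratings here is an assumption for illustration, not from the post):

```python
# Pick a standard resistor power rating given a calculated dissipation
# and a safety factor (x3 per the telecom practice, x10 old-school).
# STANDARD_RATINGS is an assumed, illustrative list of common values.
STANDARD_RATINGS = [0.125, 0.25, 0.5, 1.0, 2.0, 3.0, 5.0, 10.0, 25.0]

def pick_rating(p_dissipated, factor=3.0):
    """Return the smallest standard rating >= dissipation * factor."""
    target = p_dissipated * factor
    for rating in STANDARD_RATINGS:
        if rating >= target:
            return rating
    raise ValueError("need a bigger resistor than this list covers")

print(pick_rating(0.50))        # 0.50 W x 3 = 1.5 W -> prints 2.0
print(pick_rating(0.50, 10.0))  # 0.50 W x 10 = 5.0 W -> prints 5.0
```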

I've seen "5W" ceramic resistors get hot enough to melt the solder on their legs and fall out of circuit, running at 2W continuous in a non-ventilated enclosure.

Please note too that the "5W" resistors you buy these days are only about 60% of the volume and mass of the "5W" resistors we had in the 1970's.

Also keep in mind it makes a LOT of difference if the resistor is mounted ON the PCB or half an inch above it, and if there is space for convection airflow or other parts tight around it.

Testing is very important. If a resistor gets hot enough to burn your finger it's too small.


Joined May 13, 2013
You would also want to consider the power supply drop.
Especially if you are using a 9 V battery at 100 mA, the voltage will probably drop to 7.5 V or so, your LEDs will get less current, and your resistor will dissipate less power. But I guess it's much safer to go with a higher wattage rating anyway.

*even power adapters drop a significant amount of voltage when under load


Joined Apr 24, 2011
The first thing you need to do is forget rules of thumb, after the one that says "always RTFM", if you're doing anything beyond something that sits in your air-conditioned living room.

Here's a "typical" derating curve for a 5 W resistor. (I call it typical because it's the very first one I found.)

These are very high temperature resistors, capable of operation up to 250 °C. However, do note that their rated wattage drops off above 70 °C.
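Datasheet derating curves of this kind are typically linear between the knee and the maximum operating temperature. A sketch using the figures from this post (70 °C knee, 250 °C max; for a real part, read the actual datasheet):

```python
# Linear power derating above the knee temperature, as on typical
# datasheet curves. Knee (70 C) and max (250 C) are the figures
# quoted in the post, not from any specific part's datasheet.
def derated_power(p_rated, t_ambient, t_knee=70.0, t_max=250.0):
    """Allowed dissipation (W) at a given ambient temperature (C)."""
    if t_ambient <= t_knee:
        return p_rated
    if t_ambient >= t_max:
        return 0.0
    return p_rated * (t_max - t_ambient) / (t_max - t_knee)

print(derated_power(5.0, 25.0))   # below the knee -> prints 5.0
print(derated_power(5.0, 160.0))  # halfway to t_max -> prints 2.5
```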



Joined Feb 11, 2008
That derating curve is a joke!

It says you can run a 5W resistor at 5W dissipation, in an ambient temperature of 80'C.

No way in hell, unless you crimp its leads.


Joined Nov 3, 2012
My 2 cents.

I recently repaired a current production linear power supply rated to supply 11 amps. This unit was well-designed with forced air cooling through a cooling tunnel.

It had four 0.22 Ω resistors in parallel, in pairs with two power transistors, to share the 11 amps. Problem was, two were open (nothing else wrong).

Each resistor carried 11 A / 4 = 2.75 A, so it dissipated 2.75² × 0.22 Ω ≈ 1.66 watts.
The resistors in the unit were quality ceramics rated at 4 watts.
Physically, they were about half the size of sand-filled 5 W wirewound resistors.
So, even though they were rated at 4 watts, they got quite warm-to-hot at load.
Replaced all four with the much larger 5-watt parts: much cooler, no issues.
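The repair arithmetic above, including what happens once two of the four resistors go open (this assumes the current splits evenly between identical resistors, and that the supply still forces the full 11 A through the survivors):

```python
# Current sharing across 4 equal 0.22-ohm resistors carrying 11 A total,
# per the repair story above. Assumes equal split and constant total current.
I_TOTAL = 11.0
R = 0.22
N = 4

i_each = I_TOTAL / N        # 2.75 A per resistor
p_each = i_each ** 2 * R    # ~1.66 W each, in a 4 W part

# With two resistors open, the two survivors carry double the current:
p_survivor = (I_TOTAL / 2) ** 2 * R  # ~6.66 W each -- well past 4 W rating

print(f"normal: {p_each:.2f} W each; two open: {p_survivor:.2f} W each")
```

Which may help explain why the remaining pair eventually failed too: once the first two opened, the survivors were run far beyond their rating.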

My experience is to go conservative with the ratings on power resistors, and size does matter.
As RB said, I would use the 'ouch' method as the final test.


Joined Feb 11, 2008
Try reading it again. It does not say what you say it says.

It says the 5W rating holds until 70'C. ...
The X scale on the chart places that top knee at 80'C. As for the funny little -70 label at the chart top I have no idea what that means.

And you are missing my point, a modern sized 5W resistor dissipating 5W will get hot enough even at 25'C ambient to melt the solder connecting it to a PCB and the resistor will fail.

At 80'C ambient you would be lucky to run 1.5W-2W in a modern 5W resistor before the solder melts and fails.

That's why I mentioned the need to crimp the leads, which to me anyway turns it from a "resistor" into a "heating element"!

In the "hobby LED" context of post #1 it would suggest a "resistor" is one that can be soldered onto a LED or soldered into a PCB.