# Use 1/2W resistors in place of 1W?

Discussion in 'General Electronics Chat' started by Mossen, May 30, 2011.

1. ### Mossen Thread Starter New Member

Dec 21, 2010
16
0
Hello

I'm building a circuit that calls for a 1W resistor. But I only have 1/2 W available. Can I use 2 of those to do the job as long as the resistance calculates to the same value?

Mossen

2. ### tom66 Senior Member

May 9, 2009
2,613
214
Yes. Resistors in series add, resistors in parallel (for the same value) divide the resistance by the number of resistors.
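Those two rules are easy to sanity-check in a few lines. A Python sketch (the 100 ohm target is just an example value, not from the thread): two half-value resistors in series, or two double-value resistors in parallel, both hit the target, and with equal values each part carries half the power, so two 1/2 W parts can stand in for one 1 W part.

```python
def series(*rs):
    """Equivalent resistance of resistors in series: values add."""
    return sum(rs)

def parallel(*rs):
    """Equivalent resistance of resistors in parallel: reciprocals add."""
    return 1.0 / sum(1.0 / r for r in rs)

R = 100.0  # target resistance in ohms (example only)

# Two R/2 resistors in series give R; two 2R resistors in parallel give R.
# With equal values the load splits evenly, so each 1/2 W part sees half
# the total power.
print(series(R / 2, R / 2))
print(parallel(2 * R, 2 * R))
```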

3. ### k7elp60 Senior Member

Nov 4, 2008
478
69
To add to what Tom66 said: the power rating also increases when resistors are connected in series or parallel. If they are the same value they share the load equally, so you can just add their wattage ratings together.

4. ### Mossen Thread Starter New Member

Dec 21, 2010
16
0
Just for discussion's sake, is this true even when the resistors used are not the same value? In the example above, if one of them is dropping a much larger voltage, wouldn't it also be dissipating more than its rated wattage?

5. ### beenthere Retired Moderator

Apr 20, 2004
15,815
283
Power dissipated by a resistor is the product of the voltage across it and the current through it. If the resistors are in parallel, one can't have a larger voltage drop than the other; they all see the same voltage, and only the current through each differs, depending on its value.
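The series case Mossen asked about is the one where the split is uneven. A quick sketch (the 12 V supply and the 10/100 ohm values are made up for illustration): in series the current is common, so P = I²R and the larger resistor dissipates proportionally more power. That is why simply adding the wattage ratings only works when the values are equal.

```python
# Two resistors of different values in series across a supply.
V = 12.0               # supply voltage (example value)
r1, r2 = 10.0, 100.0   # example values, deliberately unequal

i = V / (r1 + r2)      # the series current is common to both
p1 = i**2 * r1         # power dissipated in r1
p2 = i**2 * r2         # power dissipated in r2

# Since the current is shared, the power ratio equals the resistance
# ratio: r2 dissipates 10x the power of r1 here.
print(round(p1, 3), round(p2, 3))
```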

6. ### #12 Expert

Nov 30, 2010
16,665
7,313
A one watt resistor is capable of dissipating one watt of heat without being damaged, provided the surrounding air is near room temperature, nearby parts don't heat it by conduction, and so on.

7. ### Jaguarjoe Active Member

Apr 7, 2010
770
90
It is always good to derate a resistor's wattage by at least a factor of 2; a factor of 2-1/2 is even better.
A 1 watt resistor dissipating a full 1 watt will get very warm. Some can reach 200 degrees F.
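Jaguarjoe's rule of thumb can be turned into a simple selection check. A sketch only: the 2x factor is his guideline rather than a standard, and the rating list is just the common carbon/metal film sizes assumed for illustration.

```python
# Common off-the-shelf power ratings, in watts (assumed list).
STANDARD_RATINGS = [0.125, 0.25, 0.5, 1.0, 2.0, 5.0]

def pick_rating(expected_w, derate=2.0):
    """Smallest standard rating meeting the derating factor."""
    needed = expected_w * derate
    for rating in STANDARD_RATINGS:
        if rating >= needed:
            return rating
    raise ValueError("no standard rating large enough")

# A resistor expected to dissipate 0.4 W needs a 0.8 W rating at 2x
# derating, so the smallest standard choice is a 1 W part.
print(pick_rating(0.4))
```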

8. ### Kingsparks Member

May 17, 2011
118
5
On the question of one resistor running hotter than the other when the values are different:

Take a 12V supply, two resistors in parallel, a 10 ohm and a 100 ohm.
They both drop 12V. The 10 ohm draws 1200mA or 1.2A. The 100 ohm draws 120mA or .12A.

P = I*V, so the 10 ohm dissipates 12 * 1.2 = 14.4 W and the 100 ohm dissipates 12 * 0.12 = 1.44 W.

So yes, the value makes a difference; whether a given resistor exceeds its rating depends on how much power it ends up dissipating.
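Kingsparks' numbers check out. A sketch reproducing the arithmetic (parallel resistors all see the supply voltage, so I = V/R and P = V*I for each branch):

```python
# Kingsparks' example: 12 V across a 10 ohm and a 100 ohm in parallel.
V = 12.0
branches = (10.0, 100.0)  # ohms

currents = {r: V / r for r in branches}       # I = V / R per branch
powers = {r: V * i for r, i in currents.items()}  # P = V * I per branch

# 10 ohm: 1.2 A and 14.4 W; 100 ohm: 0.12 A and 1.44 W.
for r in branches:
    print(r, round(currents[r], 2), round(powers[r], 2))
```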