Resistor heat.

Thread Starter

cjdelphi

Joined Mar 26, 2009
272
Resistors seem a little counterintuitive to me, even though I understand how they work well enough.

For example.

Say you have 10 VDC and you wish to supply 100 mA to the circuit: 10 V / 0.100 A = 100 ohms, pretty straightforward. Heat dissipation becomes 10 V * 0.100 A = 1 watt, so you'd need a 1 watt resistor or you'd risk burning it up.

But I don't see how that's the resistor "resisting" the voltage by turning the excess into heat. If you put 100 mA through a wire, or anything conductive, isn't it going to create just as much heat?

So suppose 100 mA is flowing through the circuit. If you stick a 1 watt component rated at 10 V into the equation, shouldn't the resistor then produce no heat, because it's no longer having to deal with 1 watt of energy? I mean, stick in a 10k resistor and next to no heat is produced, and there's no way that 10k resistor is turning 1 watt into heat; it would be cold as ice and supplying only a small amount of energy...

(Resistors don't turn excess into heat unless nothing is there to eat up the energy?)
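For reference, the arithmetic for both cases can be checked in a few lines of Python. This is just a sketch, assuming the full 10 V appears across the resistor in each case, as in the examples above.

Code:
# Ohm's law arithmetic for the two cases above.
# Assumes the full 10 V supply appears across the resistor.
V = 10.0             # supply voltage, volts

# Case 1: choose R for 100 mA, then find the dissipation.
I1 = 0.100           # amps
R1 = V / I1          # 100 ohms
P1 = V * I1          # 1.0 W -> needs at least a 1 W part

# Case 2: a 10k resistor across the same 10 V.
R2 = 10_000.0        # ohms
I2 = V / R2          # 0.001 A = 1 mA
P2 = V * I2          # 0.01 W -> barely warm

print(f"Case 1: R = {R1:.0f} ohm, I = {I1 * 1000:.0f} mA, P = {P1:.2f} W")
print(f"Case 2: R = {R2:.0f} ohm, I = {I2 * 1000:.1f} mA, P = {P2:.3f} W")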
 

beenthere

Joined Apr 20, 2004
15,819
The quantity of heat dissipated in any resistance is equal to the square of the current times the resistance (P = I^2 * R). That means that one amp through a wire with perhaps 1 milliohm of resistance will not dissipate as much heat as one amp through a 100 ohm resistor: 0.001 watt in the wire versus 100 watts in the resistor.

If any circuit has current, then it dissipates heat.
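A minimal sketch of that comparison in Python (the 1 milliohm figure is just a rough illustrative value for a short piece of wire):

Code:
# P = I^2 * R for the same 1 A through two very different resistances.
I = 1.0            # amps
R_wire = 0.001     # ohms, roughly 1 milliohm for a short wire (illustrative)
R_resistor = 100.0 # ohms

P_wire = I**2 * R_wire          # 0.001 W
P_resistor = I**2 * R_resistor  # 100 W

print(f"Wire:     {P_wire:.3f} W")
print(f"Resistor: {P_resistor:.1f} W")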
 

italo

Joined Nov 20, 2005
205
Power is heat. Power is the product of V * I; increase one or the other and the heat increases. It does not matter what component is involved. And finally, running 1 W into a 1 W resistor is a bad idea: it will not fail, but it will get hot, causing the components next to it to get hot. My rule: for 1 W of dissipation, use a 2 W resistor or more.
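As a rough sketch of that rule of thumb in Python (the list of standard wattages here is assumed for illustration, not something from the post):

Code:
# Rough derating helper: pick a standard resistor wattage that is at
# least twice the expected dissipation. The list of standard sizes is
# an assumption for illustration only.
STANDARD_WATTAGES = [0.125, 0.25, 0.5, 1.0, 2.0, 3.0, 5.0, 10.0]

def pick_wattage(p_dissipated, derating=2.0):
    """Return the smallest listed wattage >= derating * dissipation."""
    target = p_dissipated * derating
    for w in STANDARD_WATTAGES:
        if w >= target:
            return w
    raise ValueError("dissipation too high for the listed sizes")

print(pick_wattage(1.0))   # 2.0 -> use a 2 W part for 1 W of dissipation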
 

Audioguru

Joined Dec 20, 2007
11,248
Say you have 10 VDC and you wish to supply 100 mA to the circuit: 10 V / 0.100 A = 100 ohms, pretty straightforward. Heat dissipation becomes 10 V * 0.100 A = 1 watt, so you'd need a 1 watt resistor or you'd risk burning it up.
No.
If you have 10 V and wish to supply 10 V at 100 mA to a circuit, then do not use a series 100 ohm resistor, or the circuit will receive reduced voltage and reduced current.
Your example uses the 100 ohm resistor as the load, so it has 10 V across it and 100 mA through it. It heats with 10 V x 100 mA = 1 W.
A 10 V source in series with a 100 ohm resistor will deliver only 5 V at only 50 mA to a 100 ohm load.
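A quick sketch of that series arithmetic in Python (source, series resistor, and load values as above):

Code:
# 10 V source feeding a 100 ohm load through a 100 ohm series resistor.
V_source = 10.0
R_series = 100.0
R_load = 100.0

I = V_source / (R_series + R_load)   # 0.05 A = 50 mA
V_load = I * R_load                  # 5 V at the load
P_series = I**2 * R_series           # 0.25 W heating the series resistor

print(f"I = {I * 1000:.0f} mA, V_load = {V_load:.1f} V, P_series = {P_series:.2f} W")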

If you put 100 mA through a wire, or anything conductive, isn't it going to create just as much heat?
No.
The heating power is the voltage across a part times the current through it. A resistor has a voltage across it and a current through it, so it gets hot. A wire has a current through it but almost no voltage across it, so it does not get warm.
 

Ratch

Joined Mar 20, 2007
1,070
cjdelphi,

But I don't see how that's the resistor "resisting" the voltage by turning the excess into heat. If you put 100 mA through a wire, or anything conductive, isn't it going to create just as much heat?
That is because you don't understand the relationship between voltage, energy, and resistance. What do you think voltage is? Voltage is the energy density of the charge; its MKS units are joules/coulomb. If you want to pass a quantity of charge through a resistor, it is going to take energy. If you want to sustain an amperage through a resistor, it is going to take power. If you measure 5 volts across a resistor, that means the energy density of the charge decreased by 5 joules/coulomb. If you want to know the energy required to push that charge through the resistor, multiply 5 joules/coulomb by the amount of charge. If you want to know the power needed to sustain an amperage, multiply 5 joules/coulomb by the current through the resistor.

So a resistor operates by dissipating the charge's energy as heat, which lowers the energy density (voltage) of the charge after it has passed through the resistor.
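A short numeric sketch of those statements in Python (5 V across the resistor as above; the 2 coulombs of charge and 0.1 A of current are illustrative values, not from the post):

Code:
# Voltage as energy per unit charge (joules per coulomb).
V = 5.0      # volts = joules/coulomb dropped across the resistor

Q = 2.0      # coulombs of charge pushed through (illustrative)
E = V * Q    # energy needed: 10 joules, all ending up as heat

I = 0.1      # amps sustained through the resistor (illustrative)
P = V * I    # power needed: 0.5 watts, dissipated as heat

print(f"Energy to move {Q} C through the resistor: {E} J")
print(f"Power to sustain {I} A: {P} W")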

Ratch
 