What causes a resistor to heat up?

Discussion in 'Power Electronics' started by Raidzor, Oct 25, 2017.

1. Raidzor Thread Starter New Member

Oct 17, 2017
8
0
So let's say I create a circuit. I use a 5V battery that can supply up to 2A, a 10 ohm resistor and a 5V 500 mA LED. Theoretically the resistor limits the current to 500mA, so my LED should light up at full brightness. But why does the resistor heat up after a bit, even if the total amount of current flowing through is 500mA? How could I reduce the heat on the resistor?
I know it's a very basic question, but I really can't figure it out.

2. ian field AAC Fanatic!

Oct 27, 2012
6,543
1,198
Electrons in multiple collisions.

3. Uilnaydar AAC Fanatic!

Jan 30, 2008
118
39
If you have 500mA going through a 10 ohm resistor: P = I^2*R = 0.5^2*10 = 2.5W, MUCH more than a 1/4W resistor can dissipate...
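A quick sanity check of that figure (a minimal sketch; the 500 mA value is the premise from the original question, not the actual circuit current):

```python
# Power dissipated in a resistor carrying current I: P = I^2 * R
current_a = 0.5        # 500 mA, from the question's premise
resistance_ohm = 10

power_w = current_a ** 2 * resistance_ohm
print(power_w)  # 2.5 W, ten times the rating of a common 1/4 W resistor
```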

4. Papabravo Expert

Feb 24, 2006
12,279
2,723
Resistors heat up because the power dissipated in them is converted to heat, which must be removed via radiation, conduction or convection.

5. Raidzor Thread Starter New Member

Oct 17, 2017
8
0
Yes, but wouldn't that be the case with ONLY the resistor connected? Because the total draw is 500mA and the LED theoretically "consumes" all of that.

6. AlbertHall AAC Fanatic!

Jun 4, 2014
8,149
2,019
If the resistor and the LED are connected in series then whatever current flows, the same current MUST flow in both the resistor and the LED. For a 5V supply feeding a 10Ω resistor and an LED this current will be less than 500mA. Exactly what that current is will depend on the characteristics of the LED.

7. GopherT AAC Fanatic!

Nov 23, 2012
7,983
6,786
No, the LED doesn’t consume all of the energy.

The voltage drop across an LED (a diode) is typically 2 to 3 volts. The LED only consumes power equal to voltage drop (volts) x current (amps) = power (watts).

If you put more than the design voltage across the LED, excessive current will flow and the LED will burn out.

8. Raidzor Thread Starter New Member

Oct 17, 2017
8
0
So, if I connected ONLY the resistor in the circuit, would the heat dissipated remain the same?
I feel so dumb right now but I still can't get it.

9. AlbertHall AAC Fanatic!

Jun 4, 2014
8,149
2,019
No. With only the resistor the current would be 500mA and the power dissipated in the resistor would be 2.5W. If the LED is connected in series with the resistor then the current will be less than 500mA and the power dissipated in the resistor will be less than 2.5W.

Let's suppose the voltage across the LED is 3V; then there would be 2V across the resistor (the two drops adding up to the 5V supply). With 2V across a 10Ω resistor the current would be 200mA through both the resistor and the LED. The dissipation in the resistor would then be 2V x 200mA = 400mW.
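The series-circuit arithmetic above can be sketched as follows (the 3 V LED forward drop is the assumption from this post, not a measured value):

```python
# Series LED circuit: supply voltage = V_led + V_resistor (Kirchhoff's voltage law),
# and the same current flows through both components.
supply_v = 5.0
led_forward_v = 3.0      # assumed LED forward drop
resistance_ohm = 10.0

resistor_v = supply_v - led_forward_v      # 2 V left across the resistor
current_a = resistor_v / resistance_ohm   # 0.2 A (200 mA) through both parts
resistor_power_w = resistor_v * current_a  # 0.4 W dissipated in the resistor
led_power_w = led_forward_v * current_a    # ~0.6 W dissipated in the LED

print(current_a, resistor_power_w, led_power_w)
```

Note how the resistor's dissipation drops from 2.5 W (resistor alone) to 0.4 W once the LED takes most of the supply voltage.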

10. Raidzor Thread Starter New Member

Oct 17, 2017
8
0
Ooh ok, I understand now. So the voltage is distributed across the loads. Is there a way to calculate the voltage on each load, or do I have to measure it directly?

12. GopherT AAC Fanatic!

Nov 23, 2012
7,983
6,786
Measuring is an easy way to confirm a calculation.

13. BR-549 AAC Fanatic!

Sep 22, 2013
4,936
1,348
Surface area. That's the reason it's hot. Replace the resistor with a physically larger resistor and the heat will go down. The power dissipation won't, but the temp will.
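A rough way to see why size matters is a temperature-rise estimate using thermal resistance: rise above ambient ≈ power × θ (°C/W). The θ values below are illustrative assumptions for this sketch, not datasheet figures:

```python
# Temperature rise above ambient ~= power * thermal resistance (theta, in deg C per watt).
# A physically larger resistor has more surface area, hence a lower theta.
power_w = 0.4          # dissipation from the series-circuit example above
theta_small = 150.0    # assumed deg C/W for a small 1/4 W part
theta_large = 50.0     # assumed deg C/W for a physically larger 2 W part

rise_small = power_w * theta_small   # roughly 60 deg C rise
rise_large = power_w * theta_large   # roughly 20 deg C rise
print(rise_small, rise_large)
```

Same 0.4 W either way; the bigger package just sheds it over more surface area, so it runs cooler.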

Put a smaller resistor in... it will get really hot.