Let's call the load an LED (it could just as easily be a resistor)...
14.5 V (lead-acid battery)
1 LED (or anything; let's say 100 mA for argument's sake)
(14.5 V - 2.4 V) / 0.100 A = 121 Ω series resistor, dissipating 12.1 V × 0.1 A = 1.21 W
So: a 121 Ω resistor, rated at say 2 W, connected to a 14.5 V battery to light up a 100 mA LED...
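To make the arithmetic explicit, here is a minimal Python sketch of the same calculation. The 2.4 V forward drop and 100 mA target current are just the assumed values from the example above:

```python
# Series-resistor sizing for an LED on a battery supply.
# Assumed values from the example: 14.5 V supply, 2.4 V LED forward drop, 100 mA target current.
V_SUPPLY = 14.5   # battery voltage (V)
V_LED = 2.4       # LED forward voltage (V)
I_LED = 0.100     # desired LED current (A)

v_resistor = V_SUPPLY - V_LED    # voltage the resistor must drop (V)
r = v_resistor / I_LED           # required series resistance (ohms)
p = v_resistor * I_LED           # power dissipated in the resistor (W)

print(f"Resistor: {r:.0f} ohms, dropping {v_resistor:.1f} V, dissipating {p:.2f} W")
# -> Resistor: 121 ohms, dropping 12.1 V, dissipating 1.21 W
```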
So is the resistor going to heat up simply because of the current traveling through it, or is the heat generated because it prevents more electrons from flowing once a load is applied? Without a load the circuit reads 14.5 V, and with a load (say an LED) the voltage across the LED reads 2.4-4.2 V, depending on the bandgap and the LED specs.
So why does the voltage drop when I measure it with the multimeter? Shouldn't it still read 14.5 V and just deliver 100 mA to the load?
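To picture what the meter sees, here is a small sketch (same assumed numbers: 14.5 V battery, 121 Ω resistor, 2.4 V LED drop) of how the supply voltage splits around the loop per Kirchhoff's voltage law: the full battery voltage still appears across the battery terminals, the LED only shows its forward drop, and the resistor takes the rest:

```python
# Voltage drops around the series loop (Kirchhoff's voltage law),
# using the assumed numbers: 14.5 V battery, 121-ohm resistor, 2.4 V LED drop.
V_SUPPLY = 14.5
R_SERIES = 121.0
V_LED = 2.4

i = (V_SUPPLY - V_LED) / R_SERIES   # loop current (A)
v_resistor = i * R_SERIES           # drop across the resistor (V)

print(f"Loop current:        {i*1000:.0f} mA")
print(f"Across the battery:  {V_SUPPLY:.1f} V")   # meter across the battery still reads the full supply
print(f"Across the resistor: {v_resistor:.1f} V")
print(f"Across the LED:      {V_LED:.1f} V")      # meter across the LED reads only its forward drop
# The drops sum back to the supply: 12.1 V + 2.4 V = 14.5 V
```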