Dropping from 44V to power an LED

Thread Starter

brightnight1

Joined Jan 13, 2018
91
I have a 44V battery and I want to run a low-power 12V LED from an on/off switch. Is there any problem with dropping the remaining 32V across a resistor to power the LED? Thanks!
 

thedoc8

Joined Nov 28, 2012
162
You have one LED that you want to run off your 44 volt battery... correct? A 2.2k resistor or thereabouts would work, and I don't see any problems.
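A quick sanity check of that value as a Python sketch, assuming a typical ~3 volt forward drop (the real figure depends on the LED):

```python
# Rough check of a 2.2k series resistor on a 44 V supply.
# The 3 V forward voltage is an assumption, not a datasheet value.
v_supply = 44.0       # battery voltage
v_forward = 3.0       # assumed LED forward voltage
r_series = 2200.0     # proposed resistor, ohms

i_led = (v_supply - v_forward) / r_series    # LED current, amps
p_resistor = i_led ** 2 * r_series           # resistor dissipation, watts

print(f"LED current: {i_led * 1000:.1f} mA")        # ~18.6 mA, within a typical 20 mA limit
print(f"Resistor dissipation: {p_resistor:.2f} W")  # ~0.76 W, so use a 1 W or 2 W part
```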
 

Thread Starter

brightnight1

Joined Jan 13, 2018
91
You have one LED that you want to run off your 44 volt battery... correct? A 2.2k resistor or thereabouts would work, and I don't see any problems.

Correct.
Can I use a normal 1/4 watt resistor? Will it get hot? Will it be dissipating/wasting power even when the LED is off?
 

Tonyr1084

Joined Sep 24, 2015
7,899
You could give us the numbers and let us work out the math, OR you could work out the math yourself. LEDs aren't rated in voltage; they're rated in current. Since we don't know anything about the LED you have, we can't really answer your question, but you can still figure it out on your own.

Start by testing the LED with a low voltage and an assumed current of 10 mA (milliamps, or 0.010 amps). Since older LEDs typically operate at 10 to 20 mA, let's start with 5 volts for testing purposes. Most LEDs have a forward voltage rating that depends on their type, and this is the critical piece of information you need. Start by assuming a forward voltage of 3 volts; modern LEDs are close to that. Older ones (hard to find these days) may have lower forward voltages, and we'll get to that part in a bit.

Assume you use a 5 volt source and an LED with a 3 volt forward rating. Subtract that voltage from the source voltage: 3 volts taken away from 5 volts leaves you with 2 volts remaining. You now need to calculate the resistance that will provide 10 mA. Voltage divided by current in amps (not milliamps) gives you the necessary resistance; likewise, voltage divided by resistance gives you the current. Dividing the remaining 2 volts by 0.01 amps gives 200 ohms, but since you don't yet know the real forward voltage, just divide the full 5 volts by 0.01 amps and use the 500 ohms you get; the LED will simply run a little below 10 mA, which is safe. So using a 500 ohm resistor and a 5 volt source you can safely light up your LED and measure its ACTUAL forward voltage directly across the leads of the LED. You may find 2.9 volts or possibly 3.3 volts, or some other value, depending on the actual LED. Red LEDs typically (I said "TYPICALLY") have low forward voltages around 2.2 volts, again depending on the manufacturer. Green, blue and white are typically above 3 volts.
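Here is that test setup as a short Python sketch (the 3 volt forward voltage is only an assumed starting point; the whole point of the test is to measure the real value):

```python
# Sizing a test resistor for measuring an LED's actual forward voltage.
# Dividing the full 5 V source by 10 mA (ignoring the forward drop) is deliberately
# conservative: the real current will come out a little under 10 mA.
v_test = 5.0          # test supply voltage
i_target = 0.010      # target test current, 10 mA

r_test = v_test / i_target
print(f"Conservative test resistor: {r_test:.0f} ohms")   # 500 ohms

# With a ~3 V forward drop (an assumption) the actual test current would be about:
v_forward_guess = 3.0
i_actual = (v_test - v_forward_guess) / r_test
print(f"Approximate test current: {i_actual * 1000:.1f} mA")  # ~4 mA, safe for any LED
```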

Now that you know the forward voltage you can calculate the resistor needed to power the LED from a 44 volt battery. Let's assume you measured 2.9 volts across your LED. 44 volts minus 2.9 volts leaves 41.1 volts. At 10 mA your LED should be bright enough; if not, you can likely go as high as 20 mA, but generally not higher than that. Let's shoot for the middle, 15 mA. 41.1 volts divided by 0.015 amps equals 2740 ohms. In this example a 2740 ohm resistor will safely light your LED without burning it out. But let's not forget to calculate the wattage either.
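The same arithmetic in Python, using the example 2.9 volt reading (substitute whatever you actually measure):

```python
# Series resistor for running one LED from the 44 V battery.
v_supply = 44.0       # battery voltage
v_forward = 2.9       # forward voltage measured in the test above (example value)
i_led = 0.015         # chosen LED current, 15 mA

r_series = (v_supply - v_forward) / i_led
print(f"Required series resistance: {r_series:.0f} ohms")  # 2740 ohms
# A nearby standard value works too: 2.7k gives about 15.2 mA, 3.3k about 12.5 mA.
# A slightly larger resistor just means slightly less current.
```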

Wattage is the product of voltage and current. In this example you could use the reduced voltage, but using the full supply voltage gives you a safer margin. So 44 × 0.015 = 0.66 watts (660 mW). That's larger than a half watt (a half watt is 500 mW). For safety you'd generally prefer a margin of around twice the needed wattage. Since they don't make 1.3 watt resistors (that I'm aware of), you can probably use a 1 watt resistor safely enough, but if you find the resistor getting too hot you can go with a 2 watt resistor.
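And the power check, continuing with the same numbers:

```python
# Power dissipated in the series resistor. Using the full supply voltage
# gives a safety margin; the exact figure uses the 41.1 V actually across it.
v_supply = 44.0
v_across_resistor = 41.1
i_led = 0.015

p_margin = v_supply * i_led              # 0.66 W, the conservative estimate
p_actual = v_across_resistor * i_led     # ~0.62 W actually dissipated

print(f"Conservative estimate: {p_margin:.2f} W")
print(f"Actual dissipation:    {p_actual:.2f} W")
# Either way, pick a 1 W part at minimum; a 2 W part runs cooler if you have the space.
```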

So to recap:

1. Subtract the LED's forward voltage from the source voltage.
2. Divide that reduced voltage by the desired current (typically 10 to 15 mA) to get the resistance.
3. Multiply the total voltage by the current to get the wattage, then use roughly double that rating. As a minimum, good general practice is 1.5 times the calculated figure. In my example the expected dissipation came out to 660 mW; 1.5 times that gives 0.99 watts (990 mW), so I think you can get away with a 1 watt resistor here. Others here will disagree and say you should go with a 2 watt resistor, and there's no problem in doing that, except where you don't have room for the larger resistor.
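Putting the recap together as a small Python helper (the 15 mA default and 1.5× margin are just the numbers from this example; adjust them to suit your LED):

```python
def led_resistor(v_supply, v_forward, i_led=0.015, margin=1.5):
    """Return (resistance_ohms, minimum_resistor_watts) for one LED in series
    with a resistor, following the recap above. Defaults are example values."""
    r = (v_supply - v_forward) / i_led    # steps 1 and 2: voltage drop / current
    p = v_supply * i_led * margin         # step 3: wattage with a safety margin
    return r, p

# Example from this thread: 44 V battery, 2.9 V measured forward voltage, 15 mA
r, p = led_resistor(44.0, 2.9)
print(f"Resistor: {r:.0f} ohms, rated for at least {p:.2f} W")  # 2740 ohms, 0.99 W
```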

Ohm's law states that voltage equals resistance times current. Conversely, voltage divided by resistance gives you the current, and voltage divided by current gives you the resistance. The related power formula says that wattage is voltage multiplied by current.

Good luck. Now you should be able to work out the numbers yourself. If you have any further questions, be sure to ask; someone here can definitely answer them. Don't be shy.
 