# Voltage Drop and Wire Capacity in Low Voltage Lighting

#### lsllll

Joined May 31, 2017
5
Hi. First post here, so bear with me. I am trying to convert my line-voltage (120 VAC) landscape lights to low-voltage LED lighting. Based on what I've seen leaving the circuit panel in my basement, I have six 12-gauge wires feeding about 35 × 50 W halogen lights in my yard. My guess is that the cables split into different areas of the yard, but I still have to follow the cables to figure that out. The lights I'm using can take a variable DC voltage in the range of 10-32 VDC. Here is what I'm using:

https://www.amazon.com/gp/product/B01FTIKM5K/ref=oh_aui_search_detailpage?ie=UTF8&psc=1

I have measured the current to be around 1.25 A for these lights.

The question I have has to do with voltage drop and the wire's capacity to carry the amperage for the lights hanging off it. Let's not worry about the source voltage for now and assume I can come up with something around 24 VDC (two car batteries, charged during the day, or something similar). Obviously the lights will all be connected in parallel to the source. Assuming that my longest run is 200 feet and that I'm using 3 wires for positive and 3 for negative, my voltage drop at the end, after 35 lights (35 × 1.25 A ≈ 45 A), would be about 9.5 V, which is within an acceptable range for me. Here's the calculator I used:

http://www.calculator.net/voltage-d...ce=200&distanceunit=feet&amperes=45&x=60&y=19

However, not all my load is at the end of the wire. I have 35 taps into the wire at different distances. So, obviously my voltage drop will be less at all lights. The question is, how do I calculate the true voltage drop so that I can determine how to size my power supply? And, in case my 6 wires split off in different directions once they leave the basement panel, will they have the capacity to handle the load?

Thanks,
Ross.

#### GopherT

Joined Nov 23, 2012
8,012

You can Google "awg ampacity" and you should get to the Wikipedia page on wire sizes. That will give you the resistance of various wire sizes. Resistance of your run × current will give you the voltage drop in each segment.

Also note: if you use 24 V bulbs and the voltage degrades to 9 V at the end of the run, you will have roughly a 7:1 brightness ratio, since power (W) is V²/R, and 24² : 9² is 576:81, about 7:1. I don't think you will be happy with the difference.

Actually, the difference will be even more exaggerated because of the lower resistance of the cooler bulbs at the end of the run.
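That V²/R ratio is easy to check numerically. A minimal sketch, assuming fixed-resistance (incandescent) bulbs so that R cancels out of the ratio:

```python
# Brightness (power) ratio of the same fixed-resistance bulb at two voltages.
# Since P = V^2 / R, the R cancels and only the voltages matter.
def power_ratio(v_near: float, v_far: float) -> float:
    return (v_near ** 2) / (v_far ** 2)

print(power_ratio(24, 9))  # 576/81, about 7.1
```

This penalty applies to resistive bulbs like the existing halogens; LED fixtures with internal regulators (as suspected later in the thread) hold their output roughly constant across the input range instead.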

#### EM Fields

Joined Jun 8, 2016
583
I have measured the current to be around 1.25 A for these lights.
At what voltage?

#### KeepItSimpleStupid

Joined Mar 4, 2014
4,556
Typical Amazon listing. You don't know a lot from it.

It's 3 W per LED. Is that at 10 V, 12 V, 32 V, or 15.22556 V?
Are the 3 LEDs in series?

I'll BET that if the voltage is between 10 and 32 V, the brightness will be the same, so you don't have a lot to worry about.

From this http://www.powerstream.com/Wire_Size.htm table you can figure it out.

Typically, you would use 2× the wire length for R. You, I think, have an asymmetric current: if the lamps are all in parallel at vastly different distances, the current in each segment drops by one lamp's worth the farther you get from the first bulb.

e.g. 24 lamps: #1 at 100 feet, #23 at 150', and #24 at 200'.

The segment before lamp #1 (the first 100') carries the current of all the lamps, but the 23-24 segment carries only the current of one lamp.
Remember there are two wires.

#### KeepItSimpleStupid

Joined Mar 4, 2014
4,556
Anyway, re-explained:
Say 24 LEDs at various distances apart, each drawing 1 A at 12 V, and let's use 1 Ω per 100 feet for the wire.

Remember, this is an example, not real life.

So if the first LED is 50' away, then the drop for that segment would be:

50 feet × 2 paths × 1 Ω/100 ft × 24 LEDs × 1 A = 24 V (basically V = IR)

If the last segment runs from LED 23 to LED 24 and is 100 feet long, you end up with:

100 feet × 2 paths × 1 Ω/100 ft × 1 LED × 1 A = 2 V (basically V = IR)

So, the first segment sees the current of all 24 LEDs, the 1-2 segment sees 23× the current of one LED, and so on, until the 23-24 segment sees 1× the current of one LED.
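The segment-by-segment bookkeeping can be sketched in a few lines of Python. Like the post's numbers, this is an example, not real life; the even 50 ft spacing is an added assumption, and the first segment (50 ft carrying all 24 lamps) works out to 24 V of drop:

```python
# Cumulative voltage drop along a daisy-chained run (V = I*R per segment).
# Example numbers from the post: 1 ohm per 100 ft of wire, 1 A per lamp,
# 24 lamps.  The even 50 ft spacing is an assumption for illustration.

OHMS_PER_FT = 1.0 / 100.0       # one conductor of the example wire
LAMP_AMPS = 1.0
segments_ft = [50.0] * 24       # hypothetical: 24 segments, 50 ft each

drop = 0.0
for k, length in enumerate(segments_ft):
    downstream = len(segments_ft) - k        # lamps fed through this segment
    r_seg = length * 2 * OHMS_PER_FT         # x2 for the out-and-back pair
    drop += r_seg * downstream * LAMP_AMPS   # this segment's I*R drop

print(f"cumulative drop at the last lamp: {drop:.0f} V")  # 300 V
```

With these toy numbers the total is absurd (300 V of drop on a 12 V supply), which is the point: real sizing needs the actual wire resistance and spacing.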

Your wire resistances will be different because of the different lengths. The table values are normalized to Ω per 1000 feet.

The equation comes from R = ρL/A, where ρ (rho) is the resistivity of copper (a material property), L is the length, and A is the cross-sectional area.
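As a quick check of R = ρL/A against the published tables (the copper resistivity and 12 AWG diameter below are handbook values, not from the thread):

```python
import math

RHO_CU = 1.72e-8      # resistivity of annealed copper, ohm-meters (~20 C)
D_12AWG = 2.053e-3    # 12 AWG bare-wire diameter, meters
M_PER_FT = 0.3048

area = math.pi * (D_12AWG / 2) ** 2   # cross-sectional area A, m^2
ohms_per_m = RHO_CU / area            # R = rho * L / A with L = 1 m
ohms_per_kft = ohms_per_m * 1000 * M_PER_FT

print(f"12 AWG: {ohms_per_kft:.2f} ohms per 1000 ft")  # ~1.58, matching the tables
```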

My guess is that a wide-range DC-DC converter has a nominal value and a wide input range, e.g. 10 to 32 V. With a 10-32 V input, they might decide to rate the current at 12 V in or 24 V in.

#### lsllll

Joined May 31, 2017
5
Thanks, KeepItSimple. I think you're right on. I would bet that the units each have a voltage regulator/converter, so it doesn't matter what they're being fed; they'll keep the same output.

To answer a question a couple of people asked, the 1.25 A was at 12 VDC. Following your logic, I think I just have to sit down and calculate the voltage drop for each light individually: the first one would be, as you said, [current of all lights] × [R to the first light] at [the supply voltage minus the drop to the first light], while the last would be [current of one light] × [R of the last segment] at [the supply voltage minus the cumulative drop to the last light].
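That per-light loop is easy to automate. A sketch under stated assumptions: 12 AWG at roughly 1.588 Ω per 1000 ft, three conductors in parallel on each leg, and the 35 lights evenly spaced along the 200 ft run (the spacing is a guess until the cables are traced):

```python
# Voltage at the last lamp of a daisy-chained run with evenly spaced taps.
N_LAMPS = 35
LAMP_AMPS = 1.25                 # measured at 12 VDC
RUN_FT = 200.0
OHMS_PER_FT = 1.588 / 1000 / 3   # 12 AWG, three wires in parallel per leg
SOURCE_V = 24.0

seg_ft = RUN_FT / N_LAMPS        # even spacing -- an assumption
v = SOURCE_V
for k in range(N_LAMPS):
    downstream = N_LAMPS - k                 # lamps fed through segment k
    r_seg = seg_ft * 2 * OHMS_PER_FT         # out-and-back pair resistance
    v -= r_seg * downstream * LAMP_AMPS      # this segment's I*R drop

print(f"voltage at the last lamp: {v:.1f} V")  # ~19.2 V
```

Spreading the taps roughly halves the drop versus lumping all 35 lamps at the 200 ft end (about 4.8 V instead of ~9.5 V), so the worst-case calculator figure is conservative.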

I found the actual formula for voltage drop at this site: