Hi. First post here, so bear with me. I'm trying to convert my line-voltage (120 VAC) landscape lights to low-voltage LED lighting. Based on what I've seen leaving the circuit panel in my basement, I have six 12-gauge wires feeding about 35 x 50 W halogen lights in my yard. My guess is that the cables split off into different areas of the yard, but I still have to follow them to confirm that. The lights I'm using can take a variable DC voltage in the range of 10-32 VDC. Here is what I'm using:
https://www.amazon.com/gp/product/B01FTIKM5K/ref=oh_aui_search_detailpage?ie=UTF8&psc=1
I have measured the current draw to be around 1.25 A per light.
My question has to do with voltage drop and the wires' capacity to carry the amperage of the lights hanging off them. Let's not worry about the source voltage for now and assume I can come up with something around 24 VDC (two car batteries charged during the day, or something similar). Obviously the lights will all be connected in parallel to the source. Assuming my longest run is 200 feet and that I'm using 3 wires for positive and 3 for negative, the voltage drop at the end, with all 35 lights lumped there (35 x 1.25 A ≈ 45 A), would be about 9.5 V, which is within an acceptable range for me. Here's the calculator I used:
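For reference, here's the lumped worst-case calculation I'm doing, as a small Python sketch. The 1.588 Ω per 1000 ft figure for 12 AWG copper is a standard approximation, and I'm treating the three paralleled wires per leg as simply dividing the resistance by three:

```python
# Lumped worst case: all ~45 A drawn at the far end of a 200 ft run,
# with 3 parallel 12 AWG conductors on each leg (positive and negative).
R_12AWG_PER_FT = 1.588 / 1000  # ohms per foot, 12 AWG copper (approx.)

def lumped_drop(current_a, one_way_ft, parallel_wires):
    # Round trip doubles the length; parallel wires divide the resistance.
    loop_resistance = 2 * one_way_ft * R_12AWG_PER_FT / parallel_wires
    return current_a * loop_resistance

print(round(lumped_drop(45, 200, 3), 1))  # ~9.5 V, matching the calculator
```

This matches the ~9.5 V the online calculator gives, but it's pessimistic since it pretends all the load is at the far end.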
http://www.calculator.net/voltage-d...ce=200&distanceunit=feet&eres=45&x=60&y=19
However, not all my load is at the end of the wire; I have 35 taps into it at different distances, so the drop at each light will obviously be less than that worst case. The question is: how do I calculate the true voltage drop so that I can size my power supply? And, in case my 6 wires split off in different directions once they leave the basement panel, will they still have the capacity to handle the load?
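Here's my rough attempt at the distributed calculation, assuming for illustration that the 35 lights are evenly spaced along the 200 ft run (which they won't be exactly). The idea is that each wire segment only carries the current of the lights downstream of it:

```python
# Distributed-load drop: walk out from the source, and in each segment
# between taps, only the downstream lights' current flows.
# Even spacing along 200 ft is an illustrative assumption.
R_12AWG_PER_FT = 1.588 / 1000  # ohms per foot, 12 AWG copper (approx.)

def drops_at_taps(tap_positions_ft, tap_current_a, parallel_wires):
    """Cumulative voltage drop at each tap, nearest to farthest."""
    r_per_ft = 2 * R_12AWG_PER_FT / parallel_wires  # round trip, paralleled
    drops, v, prev = [], 0.0, 0.0
    n = len(tap_positions_ft)
    for i, pos in enumerate(tap_positions_ft):
        downstream_current = (n - i) * tap_current_a  # lights fed past here
        v += downstream_current * (pos - prev) * r_per_ft
        drops.append(v)
        prev = pos
    return drops

taps = [200 * (k + 1) / 35 for k in range(35)]  # evenly spaced assumption
d = drops_at_taps(taps, 1.25, 3)
print(round(d[-1], 1))  # drop at the farthest light, ~4.8 V
```

If this is right, the farthest light sees roughly half the lumped worst-case drop, which matches the rule of thumb that an evenly distributed load drops about (n+1)/(2n) of the lumped figure.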
Thanks,
Ross.