Need to calculate voltage drop

Thread Starter

muhammada

Joined Jun 13, 2024
1
I need to connect a distribution panel (UL 891) to a remote panel fed with 24 VDC from the distribution panel, over a 300 ft run with 2 cables in parallel per polarity (+, -). The load is approximately 6 A. I want to calculate the voltage drop on the cables.

1. How do I calculate the voltage drop for a 300 ft long, 2-cable-per-polarity parallel run at 24 VDC?
2. What is the best cable size I can use for a low voltage drop?
 

Jon Chandler

Joined Jun 12, 2008
1,570
V = I × R

Find a copper(?) wire table that lists gauge vs. resistance and calculate the voltage drop.

The best gauge for lower voltage drop? The bigger the cable, the lower the resistance and therefore the voltage drop. There is no "best" gauge – any gauge that gives an acceptable voltage drop is fine.
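A minimal sketch of that V = I × R calculation in Python, using nominal copper resistances per 1000 ft at 20 °C from a typical AWG wire table (the table values and the `voltage_drop` helper are illustrative, not from this thread; the `parallel_runs` parameter reflects the two runs per polarity mentioned in the original question):

```python
# Nominal solid-copper resistance, ohms per 1000 ft at 20 C (typical wire-table values).
AWG_OHMS_PER_1000FT = {6: 0.3951, 8: 0.6282, 10: 0.9989, 12: 1.588, 14: 2.525}

def voltage_drop(awg, one_way_ft, amps, parallel_runs=1):
    """Voltage drop across both conductors (out and back) of a DC pair."""
    loop_ft = 2 * one_way_ft                        # current flows there and back
    r = AWG_OHMS_PER_1000FT[awg] * loop_ft / 1000   # loop resistance, ohms
    r /= parallel_runs                              # parallel runs divide the resistance
    return amps * r                                 # V = I * R

for awg in sorted(AWG_OHMS_PER_1000FT):
    vd = voltage_drop(awg, one_way_ft=300, amps=6, parallel_runs=2)
    print(f"{awg} AWG: {vd:.2f} V drop, {24 - vd:.2f} V at the remote panel")
```

With two parallel 10 AWG runs per polarity this gives roughly a 1.8 V drop at 6 A, which illustrates Jon's point: pick the smallest gauge whose drop your load can tolerate.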
 

LowQCab

Joined Nov 6, 2012
5,101
I would start with 10-gauge wire,
and go bigger from there,
if your calculated voltage drop is more than you can tolerate in your application.

"~6 A" is rather vague; the characteristics of the load may play into your decision.
 

WBahn

Joined Mar 31, 2012
32,747
I need to connect a distribution panel (UL 891) to a remote panel fed with 24 VDC from the distribution panel, over a 300 ft run with 2 cables in parallel per polarity (+, -). The load is approximately 6 A. I want to calculate the voltage drop on the cables.

1. How do I calculate the voltage drop for a 300 ft long, 2-cable-per-polarity parallel run at 24 VDC?
2. What is the best cable size I can use for a low voltage drop?
What metric is being used to determine "best" cable size? Whatever cable size you consider, a "better" cable size can be had, "for less voltage drop", by just going to a bigger cable. So you need a more useful way to define the criteria by which to determine not when a cable size is "best", but rather when it is "good enough".

Do you know how to calculate the voltage drop across a resistance that is carrying a particular current via Ohm's Law? If not, that is where we need to start the discussion.

Be aware that just knowing the voltage drop across the cable may or may not be sufficient to determine if the cable is good enough. What kind of load are you dealing with? Are there significant start-up currents? Does the load have sharp transients that result in high currents for short amounts of time each cycle?
 

Ian0

Joined Aug 7, 2020
13,112
The figure to remember is that 1mm^2 cable has a resistance of 17mΩ/m.
For 2.5mm^2 cable simply divide that figure by 2.5 etc.
If you live in a country that uses a weird way of defining cable sizes then the job is so much harder.
Determine the percentage loss that is acceptable at full power.
Then calculate the cable resistance that gives that loss.
Divide by the length x2 (because it goes there and back) to give the resistance per unit length.
Divide 17mΩ by the resistance per unit length and that is the cable size required.
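The sizing steps above can be sketched in Python (assuming the 17 mΩ/m per mm² figure for copper; `required_mm2` is an illustrative helper name, not from the thread):

```python
# Minimum metric cable size for a given acceptable loss at full load,
# assuming copper at 17 mOhm/m per 1 mm^2 of cross-section.
def required_mm2(supply_v, load_a, one_way_m, max_loss_fraction):
    max_drop_v = supply_v * max_loss_fraction   # acceptable drop at full power
    max_r = max_drop_v / load_a                 # total loop resistance allowed
    r_per_m = max_r / (2 * one_way_m)           # divide by length x2: there and back
    return 0.017 / r_per_m                      # 17 mOhm/m corresponds to 1 mm^2

# Example: 24 V supply, 6 A load, 91.44 m (300 ft) one way, 5% loss allowed:
mm2 = required_mm2(24, 6, 91.44, 0.05)
print(f"{mm2:.1f} mm^2 minimum")
```

For the numbers in this thread a 5% loss budget demands roughly 15.5 mm² of copper per conductor, which shows why a local or higher-voltage supply (as suggested below in the thread) quickly becomes attractive at 24 VDC.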

Remember that the percentage loss reduces at part load; loss is proportional to the square of the current.

Your country's electrical regulations will give minimum cable sizes for various currents which are determined by heating.
These will generally be too small to give an acceptable loss at low voltages.
 

panic mode

Joined Oct 10, 2011
4,920
Just put the 24 VDC PSU at that panel... that way the voltage drop is in the AC line.
Since the long wires then carry roughly 5x higher voltage, the current (and voltage drop) will be small.
Also, the output of the regulated PSU will not be affected.
 

panic mode

Joined Oct 10, 2011
4,920
1. How do I calculate the voltage drop for a 300 ft long, 2-cable-per-polarity parallel run at 24 VDC?
2. What is the best cable size I can use for a low voltage drop?

Let's see... first you need consistent units...

A 300 ft distance means that the total conductor length is 600 ft, or 600 ft × 0.3048 m/ft = 182.88 m.

From post #5 we get a resistance of 0.0068 Ω/m for 2.5 mm² conductors.

Multiply the two and you get 1.2436 Ω.
If the current is 6 A, the voltage drop is

V = I × R = 6 A × 1.2436 Ω = 7.46 V

so if you are sending 24 VDC from a 24 VDC supply, you will get 16.54 V at the remote panel... Not good.
You can increase the wire size from 2.5 mm² to 16 mm² to reduce the voltage drop to 1.17 V, so you get 22.83 V at the remote panel. Better, but look at the wire size and the cost...

You could save money by choosing a different PSU, such as 32 VDC, to get 24.54 VDC at the remote end with the same 2.5 mm² wire. Cheaper and a still better result.

You could also use a PSU with a sense input. Then you need to run 4 wires (two of them can be smaller, since they are only used for sensing the voltage at the remote location). The PSU then measures the voltage at the remote side and boosts its own output as needed to correct for the voltage drop. This way you get an actual 24 VDC at the load, as long as the encountered voltage drop is within the regulation limit of the PSU.

And as suggested, you could also use AC to transmit the power... since 120 VAC is about 5 times the voltage, the current will be accordingly smaller, and so is the voltage drop. In other words, instead of a 7.5 V drop you would only get some 1.5 V of drop,
and the PSU would produce 24 VDC that is regulated locally, so you actually do get 24 VDC.

This is exactly the same principle that decided the War of Currents more than a century ago – use a higher voltage for transmission to minimize losses and allow a smaller conductor size (the more economical solution).
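The arithmetic in this post can be checked with a short Python sketch (`drop_for` is an illustrative helper, not from the thread; single conductor per polarity, as assumed above):

```python
# Voltage drop for 2.5 mm^2 vs 16 mm^2 copper over 300 ft at 6 A.
FT_TO_M = 0.3048
loop_m = 2 * 300 * FT_TO_M          # 182.88 m of conductor: out and back
amps = 6.0

def drop_for(mm2):
    r = (0.017 / mm2) * loop_m      # 17 mOhm/m per 1 mm^2 of copper
    return amps * r                 # V = I * R

for mm2 in (2.5, 16):
    d = drop_for(mm2)
    print(f"{mm2} mm^2: drop = {d:.2f} V, remote = {24 - d:.2f} V")
```

Note that the original question specified two parallel runs per polarity; with two runs the loop resistance, and therefore the drops above, would be halved.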
 