Cable gauge and power

Thread Starter

Robyn

Joined May 1, 2013
31
Edit: to any newcomer to this thread, I still haven't solved my issue, but the thread has helped me formulate it better. I recommend reading post #13, as I feel it really gets to the heart of my misunderstanding.

Hi everyone,

I am working on an off grid solar setup and I just realised that for a constant amount of amps, the higher the voltage, the smaller the cable gauge can be.

In my instance I am using solar panels with a maximum current output of about 15A and voltage of about 30V.

The Victron toolkit calculator tells me that for a single panel, a 10mm2 cable run of 10m (20m both ways) will drop a dangerous amount of voltage, but with 4 panels in series, hence 120V, I can go all the way down to 4mm2 while my current remains at 15A.

My experience of electricity is through electronic circuits so this is really surprising to me because resistors are rated for power, so increasing voltage for a constant amount of amps will eventually cause the resistor to release the magic smoke, which is the exact opposite behaviour.

I was expecting cable ratings to follow the same rule, considering a cable as being a very long and low value resistor.

Can someone please explain to me why I was wrong?
 
Last edited:

Reloadron

Joined Jan 15, 2015
7,855
It's more about power, expressed in watts, than anything else.

As to the voltage and current: cable size determines the ampacity, i.e. the current the cable can safely handle. Just as an example, AWG 12 cable can safely handle 20 amps. This is fine for a short run. However, since the cable has resistance, we need to figure that in on a longer run. If my voltage drop gets excessive due to cable length, I may want to consider AWG 10 so as to reduce the cable resistance and thus the voltage drop. There are dozens of cable resistance and cable ampacity charts available to help a user make an informed decision.

Back to watts, which in a simple DC circuit amount to the voltage * the current. One horsepower is defined as 746 watts. This gets a little hypothetical and does not account for losses. However, a 1 HP motor running on 240 volts would draw about 3.1 amps, since 3.1 * 240 ≈ 746. If I run that same 1 HP motor on 120 volts, I get 6.2 amps for the current.
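Ron's arithmetic above can be sketched in a few lines of Python (an idealized, lossless calculation, as he notes):

```python
# Ideal (lossless) current draw for a 1 HP load at two supply voltages.
HP_WATTS = 746  # one horsepower in watts

def current_for_power(power_w, voltage_v):
    """Current needed to deliver power_w at voltage_v (from P = V * I)."""
    return power_w / voltage_v

print(round(current_for_power(HP_WATTS, 240), 1))  # 3.1 A on 240 V
print(round(current_for_power(HP_WATTS, 120), 1))  # 6.2 A on 120 V
```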

Does this make sense? I tried for a simple explanation.

Ron
 

MrChips

Joined Oct 2, 2009
34,628
You need to compare the current for the same power transferred.

#12 AWG copper cable (≈3.3 mm²) can carry 9 A at 120 V = 1080 W.
You can also calculate the voltage drop using a resistance of about 5 Ω per 1000 m:
20 m is 0.1 Ω,
so the voltage drop is 0.9 V @ 9 A.
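The same figures drop out of a tiny Ohm's-law sketch (using the ~5 Ω per 1000 m resistance quoted above):

```python
# Voltage drop along a conductor: V = I * R, with R proportional to length.
OHMS_PER_METRE = 5 / 1000  # ~5 ohms per 1000 m for #12 AWG copper

def voltage_drop(current_a, conductor_length_m):
    """Drop across conductor_length_m of cable carrying current_a."""
    return current_a * OHMS_PER_METRE * conductor_length_m

print(voltage_drop(9, 20))  # ~0.9 V over 20 m of conductor at 9 A
```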
 

boostbuck

Joined Oct 5, 2017
1,034
increasing voltage for a constant amount of amps will eventually cause the resistor to release the magic smoke
If you think about Ohm's law, increasing the voltage across a resistor while keeping the amps constant requires that the resistance increase in line with the voltage (to limit the current). As the resistance increases, the power dissipated also rises: the power is not determined solely by the current.
 

Thread Starter

Robyn

Joined May 1, 2013
31
Like I get that the same absolute voltage drop is a smaller percentage of a higher voltage, but the fact is that the cable is dissipating more power overall, yet it seems I can go for a smaller gauge anyway.
 

WBahn

Joined Mar 31, 2012
32,703
Hi everyone,

I am working on an off grid solar setup and I just realised that for a constant amount of amps, the higher the voltage, the smaller the cable gauge can be.

In my instance I am using solar panels with a maximum current output of about 15A and voltage of about 30V.

The Victron toolkit calculator tells me that for a single panel, a 10mm2 cable run of 10m (20m both ways) will drop a dangerous amount of voltage, but with 4 panels in series, hence 120V, I can go all the way down to 4mm2 while my current remains at 15A.

My experience of electricity is through electronic circuits so this is really surprising to me because resistors are rated for power, so increasing voltage for a constant amount of amps will eventually cause the resistor to release the magic smoke, which is the exact opposite behaviour.

I was expecting cable ratings to follow the same rule, considering a cable as being a very long and low value resistor.

Can someone please explain to me why I was wrong?
So... how would you go about increasing the voltage across a resistor while holding the current constant???

For a constant amount of amps, the voltage doesn't matter in terms of the needed wire size. The current handling capacity of a cable is independent of the voltage feeding it (until you get to issues like insulation breakdown). Now, the FRACTION of power lost in the cable is a different matter. For the same cable, the power lost will be the same at the same current. But the power delivered to the load will go up as the feed voltage goes up, making for a more efficient transfer. Conversely, as the voltage goes down, a larger fraction of the power is lost in the transfer.

The resistance of a 10 mm² copper conductor is about 1.7 Ω/km, so your 20 m total run equates to about 34 mΩ, resulting in a voltage drop of about 0.5 V and a power dissipation of 7.7 W at 15 A. If your panels are producing 30 V at 15 A, they are producing 450 W total, for about a 2% loss. If you put four panels in series and get 120 V at that same 15 A, then you are producing 1800 W; the cable still drops 0.5 V and dissipates 7.7 W, but now that amounts to only about a 0.4% loss.
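For anyone who wants to verify these numbers, here is the same arithmetic as a short sketch (the 1.7 Ω/km figure is the approximation used above):

```python
# Same cable loss at both panel voltages, but a smaller fraction of
# the total power at the higher voltage.
R_PER_KM = 1.7                    # ohms/km for 10 mm^2 copper (approx.)
R_CABLE = R_PER_KM * 20 / 1000    # 20 m of conductor -> ~34 milliohms
I = 15.0                          # amps, identical in both setups

p_cable = I ** 2 * R_CABLE        # ~7.7 W dissipated in the cable
for v_panel in (30, 120):
    p_source = v_panel * I        # 450 W or 1800 W produced
    print(f"{v_panel} V: cable loss {p_cable:.1f} W "
          f"= {100 * p_cable / p_source:.1f}% of output")
```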
 

boostbuck

Joined Oct 5, 2017
1,034
Your calculator is aiming for a specific PERCENTAGE voltage loss along the cable, whereas Ohm's law gives an ABSOLUTE voltage loss for a given current. Hence the low-voltage supply must have a proportionately lower absolute loss to achieve the same percentage loss, requiring a much larger cable.
 

WBahn

Joined Mar 31, 2012
32,703
Like I get that the same absolute voltage drop is a smaller percentage of a higher voltage, but the fact is that the cable is dissipating more power overall, yet it seems I can go for a smaller gauge anyway.
What makes you think that the cable is dissipating more power overall?

For the 10 mm² conductor, the voltage drop they show at 30 V is 1.3 V. The voltage drop they show at 120 V is 1.3 V. How is that dissipating more power? In either case it's (1.3 V)(15 A) ≈ 20 W.

FWIW, check the app you are using and make sure that what it means by cable length and what you mean are the same thing. It looks like they are talking about the one-way length (i.e., the length of a cable that has both conductors in it) and not the total conductor length out and back.
 

Thread Starter

Robyn

Joined May 1, 2013
31
There's not much evidence for that claim in what you are saying.
Hahahha fair point!

So... how would you go about increasing the voltage across a resistor while holding the current constant???
Am I wrong in thinking this is what is happening when I hook up my panels in series? Voltages are added but current remains constant?

What makes you think that the cable is dissipating more power overall
I thought that dissipated power was the same as "passed" power. As in the cable will start to radiate heat as soon as any amount of current runs through it, and that it was able to safely withstand that increase in temperature up to a certain point. What I understand from your explanation and @Reloadron's mention of ampacity is that the cable only starts warming up / dissipating once the threshold of ampacity has been passed. Is that correct?
 

WBahn

Joined Mar 31, 2012
32,703
Hahahha fair point!


Am I wrong in thinking this is what is happening when I hook up my panels in series? Voltages are added but current remains constant?
I don't have much experience with solar panels, particularly modules that have any control structures in them, so I don't know how they like being hooked up in series. Certainly the current through series-connected modules will be the same in all modules, but I don't know if they might have a tendency to fight a bit, resulting in a lower current overall. Others here have a LOT of experience with solar installations, so they can give you much better guidance on the practical behavior than I can.

I thought that dissipated power was the same as "passed" power. As in the cable will start to radiate heat as soon as any amount of current runs through it, and that it was able to safely withstand that increase in temperature up to a certain point. What I understand from your explanation and @Reloadron's mention of ampacity is that the cable only starts warming up / dissipating once the threshold of ampacity has been passed. Is that correct?
No, the cable starts dissipating power, as heat, as soon as there's current flowing in it, and that power increases as the current increases. But only part of the power that is produced and delivered by the source ends up as heat loss in the cabling. Most (hopefully) is passed through (actually via the electromagnetic fields around the cable, but that's a whole other story) to the load being fed by the cable.

The ampacity rating is essentially the limit at which the temperature rise in the conductor reaches the maximum tolerable level (with suitable safety margins incorporated) under specified conditions. That is why the ampacity rating depends on the insulation material (some materials continue to function at higher temperatures) and on installation specifics (cables bundled with many others in a plenum, for instance, can't shed heat to the surroundings as well as cables that are isolated with free airflow around them, and so reach higher temperatures at the same power dissipation).

To a good approximation, the resistance of the cable is fixed, so the power that is dissipated as heat by the cable is

P = I²R

If the current doubles, the power dissipated goes up by a factor of four (square-law behavior).

This is the big driver for why we use extremely high voltages for power transmission. By increasing the voltage, we decrease the current and, hence, the power lost in the transmission lines. Doubling the line voltage halves the current, which reduces the line losses by a factor of four. If you increase the voltage by a factor of ten, you reduce the losses by 99%. Now consider that long-haul transmission lines run at hundreds of thousands of volts, something like 1000x the voltage at the service panel, and you see the potential savings. Some of those savings can be traded for smaller conductors, saving material costs and weight, which also translates into towers that don't have to be as structurally strong to support them.
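The square-law behavior is easy to demonstrate numerically (a sketch; the line resistance and power figures are made-up example values):

```python
# I^2 * R line loss when delivering a fixed power at different voltages:
# higher voltage -> proportionally less current -> loss falls with the
# square of the current.
R_LINE = 0.5  # ohms (hypothetical line resistance)

def line_loss(power_w, voltage_v, r_line=R_LINE):
    """Heat lost in the line while delivering power_w at voltage_v."""
    current = power_w / voltage_v
    return current ** 2 * r_line

for v in (240, 480, 2400):
    print(v, round(line_loss(10_000, v), 2))  # loss drops 4x, then 100x
```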
 

Thread Starter

Robyn

Joined May 1, 2013
31
Thank you @WBahn for all the details. I really appreciate your patience with this. I'm pretty sure I understand everything that you're saying, and in terms of power sinking it makes a lot of sense. I live on a boat, and I understand that a 48VDC system will also allow you to run smaller gauge cable than a 12VDC one because you can get the same amount of power out using fewer amps. In that context power remains constant as amps go down with voltage going up. In the case of the solar panels, the screenshots (post #5) do show that power goes up (otherwise there would be no point adding panels).

I was just chatting with someone else, and they understood my conundrum when I used the analogy of an internal combustion engine. An engine with bigger cylinders (the amps analogy) will generate more power and more heat, and so will an engine revving faster (the voltage analogy). It seems to me that this Victron toolkit result is saying that I can run my engine cooler by running it faster while burning the same amount of fuel on each stroke.

As you say, hopefully someone with solar experience can chime in. In any case thank you all for spending time on this!
 

Thread Starter

Robyn

Joined May 1, 2013
31
OK, so now you have a handle on it, right? Feel free to ask when you have questions.

Ron
I'm afraid still not Ron :/, feel free to re-read my last message. I'm saying at the start that I understand the fact that for 2 loads that consume the same power, the higher voltage one will use less amps and power will remain constant. Less amps, less heat, great, but:

In the case of the solar setup an increase in voltage due to series panel doesn't decrease the amps, otherwise no matter how many panels are wired in series the output would always be the same. And I still can't reconcile how a cable conducting just as many amps and more power overall will run cooler...
 

MrChips

Joined Oct 2, 2009
34,628
Let us suppose that you have a 12 W @ 12 VDC light bulb on the boat.
The bulb will need 1 A @ 12 VDC.

Now you want to power the same light bulb from your 48 VDC solar panels. If you connect the bulb directly to 48 VDC the bulb will be very bright and will soon burn out.

You need to reduce the voltage at the bulb. One way of doing that is to insert a power resistor in series with the bulb.
You would need a 36 Ω resistor rated to dissipate in excess of 36 W. The solar panels are still delivering 1 A. The bulb gets 1 A @ 12 V and runs at 12 W. The current in the wires remains the same at 1 A.

The power resistor has to dissipate 36 W while the panels are delivering 48 W, so 75% of the power is wasted as heat. This setup has an efficiency of 25%.

Now suppose that you implement a device (called a buck converter) to deliver power efficiently at the load (the bulb). The 48 VDC is converted to 12 VDC. The bulb gets 1 A @ 12 VDC. With 100% efficiency, only 0.25 A is needed from the solar panels. For the same power delivered, while the input voltage is 4 times higher, the current is 4 times lower. This allows you to use a smaller diameter wire from the solar panels.

For these same reasons, electricity utilities transmit power at very high voltages, AC and DC. The voltage is stepped down at the consumer or distribution station.
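MrChips' two scenarios can be tabulated in a few lines (a sketch assuming an ideal, lossless buck converter):

```python
# Feeding a 12 W / 12 V bulb from 48 V: series resistor vs ideal buck.
V_IN, V_BULB, P_BULB = 48.0, 12.0, 12.0
i_bulb = P_BULB / V_BULB                 # 1 A through the bulb

# Option 1: series dropping resistor (full bulb current from the source).
r_series = (V_IN - V_BULB) / i_bulb      # 36 ohms
p_resistor = (V_IN - V_BULB) * i_bulb    # 36 W wasted as heat
eff_resistor = P_BULB / (V_IN * i_bulb)  # 0.25 -> 25% efficient

# Option 2: ideal buck converter (source supplies only the bulb's power).
i_source = P_BULB / V_IN                 # 0.25 A drawn from the panels

print(r_series, p_resistor, eff_resistor, i_source)
```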
 

drjohsmith

Joined Dec 13, 2021
1,549
@Robyn

Your question seems to be about solar panels
but no mention of batteries

How big a panel setup are you looking at ?

Solar panels only give out power when sun is on them !
Solar panels give out a variable amount of power, dependent upon the sun, temperature and load.
That's why the Solar inverters have an optimiser,
they vary the load on the panels to keep the volts out of the panels in the "optimum" range
So the panels on my roof run at around 700 volts DC

High DC voltages also need different cables to those used for AC "mains"

So is this a theory question, or a practical application ?
Either way, more information on what you want or have will be of use.
 

BobTPH

Joined Jun 5, 2013
11,463
Your misunderstanding comes from the fact that you are conflating the voltage of the panels with the voltage across the cable.

When applying Ohm's law to the cable the voltage you use is the voltage drop from one end of the cable to the other. Unless you are shorting the panel with the cable, this is not the same as the panel voltage. The voltage drop across the cable is always the same if the cable is carrying 1A. Therefore, a cable carrying 1A will always dissipate the same power no matter what the panel voltage is.
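BobTPH's point can be made concrete with a tiny sketch (the cable resistance and current are made-up example values):

```python
# The cable only 'sees' its own I * R drop, never the panel voltage,
# so its dissipation depends on the current alone.
R_CABLE = 0.1  # ohms, hypothetical cable resistance
I = 1.0        # amps flowing through the cable

for v_panel in (12, 48, 120):
    v_drop = I * R_CABLE        # 0.1 V regardless of panel voltage
    p_cable = I ** 2 * R_CABLE  # 0.1 W regardless of panel voltage
    print(v_panel, v_drop, p_cable)
```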
 

Thread Starter

Robyn

Joined May 1, 2013
31
I can't believe your patience you all :)

I'm talking about the PV cables from the panels to the MPPT. The delivery to the batteries after that is irrelevant to my question, which is: how the hell is a 10mm2 PV cable OK to conduct 15A @ 60V but not 15A @ 30V?

@BobTPH you are getting very close to making me a happy boy, but if the voltage drop is the same, why is the calculator offering a smaller gauge for the higher-voltage setup? Is it a matter of not wasting power rather than not burning the cable? Like, would the 10mm2 be OK safety-wise at 30V, despite the voltage drop representing a bigger portion of 30V than it would of a higher voltage?
 


WBahn

Joined Mar 31, 2012
32,703
I'm afraid still not Ron :/, feel free to re-read my last message. I'm saying at the start that I understand the fact that for 2 loads that consume the same power, the higher voltage one will use less amps and power will remain constant. Less amps, less heat, great, but:

In the case of the solar setup an increase in voltage due to series panel doesn't decrease the amps, otherwise no matter how many panels are wired in series the output would always be the same. And I still can't reconcile how a cable conducting just as many amps and more power overall will run cooler...
The cable doesn't really have any awareness of the power that is passing around it, only the current that is flowing in it. Perhaps here it is useful to note that the energy that is flowing from the panels to the load is not actually flowing IN the cable, but rather around it via the electromagnetic fields that are being guided by it.

But if you want a more visually accessible analogy, consider a water pump and a water wheel. We start with a pump that lifts water ten feet from a reservoir up to a tank. From the tank, a pipe connects to the top of a waterwheel that is 8' tall and drives some machinery. At the bottom of the waterwheel, another pipe takes the discharge water from the wheel back to the reservoir and to the inlet of the pump. Next to it, we build a much larger pump that lifts water up to a tank at a height of forty feet, with a pipe connecting it to a waterwheel that is 38' tall. Do I need to use a stronger pipe to connect this higher tank to this bigger waterwheel, if both wheels need the same water flow rate? No. Because, from the pipe's perspective, nothing has changed: the same amount of water is dropping two feet through it.
 