Does the Overall Power Draw Matter in terms of a wire melting?

Thread Starter


Joined Mar 31, 2020
I am running some tests on a few connectors we have. I'd like to test them at about 24 V @ 5 A DC, but the best I can manage right now is about 16 V @ 5 A.

I've done testing like this before, and usually I just follow the provided specs, so I've never had to think about this.

My understanding is that the voltage rating pertains to the insulation, because of arcing and the like.

The current is what flows through the conductor and generates the heat/melting, etc.

Multi-conductor cables have lower per-conductor current limits because of this. (This is what I am testing around, using only 2 of the 5 conductors.)

My worry is that my current test only pushes about 80 W through. What I want to test is a little over 120 W; however, the difference would just be an increase in voltage, not current.

Any clarification on this would be amazing! My current test rig is just a couple of 25 W resistors in parallel/series.
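For what it's worth, the rig sizing can be sketched with Ohm's law. This is just a rough calculation for the two operating points mentioned above (16 V and 24 V at 5 A); the 25 W resistor rating is from the post, everything else is arithmetic:

```python
# Rough load-bank sizing for a constant-current connector test.
# Operating points and the 25 W resistor rating come from the thread;
# the layout of the bank itself is left open.

def load_resistance(voltage, current):
    """Total resistance needed to draw `current` from `voltage` (Ohm's law)."""
    return voltage / current

def load_power(voltage, current):
    """Total power the resistor bank must dissipate."""
    return voltage * current

for v in (16.0, 24.0):
    i = 5.0
    r = load_resistance(v, i)
    p = load_power(v, i)
    n_min = int(-(-p // 25))  # ceiling division: minimum count of 25 W parts
    print(f"{v:.0f} V @ {i:.0f} A -> R = {r:.2f} ohm, P = {p:.0f} W, "
          f"needs >= {n_min} x 25 W resistors")
```

So the 16 V test already needs at least four 25 W resistors sharing the 80 W, and the 24 V target would need at least five.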



Joined Mar 14, 2008
Current is what determines a connector's power rating.
The voltage value just specifies the insulation rating.

So if you are testing for the current rating, the voltage used has no effect on that; you can test at a voltage just sufficient to drive the current you want.

If you want to test for the voltage rating, that is normally done at little or no current through the connector.
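The point above can be shown numerically: the heat generated in the connector is I²R of its own contact resistance, so it is the same at any supply voltage that drives the same current. The contact resistance below is an illustrative assumption, not a measured value:

```python
# Connector self-heating depends on I^2 * R_contact, not on supply voltage.
# R_CONTACT is a hypothetical figure for illustration only.

R_CONTACT = 0.01  # ohms, assumed contact + wire resistance

def connector_dissipation(current, r=R_CONTACT):
    """Power dissipated in the connector itself at a given current."""
    return current ** 2 * r

# The same 5 A through the connector at two different supply voltages:
p_at_16v = connector_dissipation(5.0)  # test run at 16 V
p_at_24v = connector_dissipation(5.0)  # test run at 24 V
print(p_at_16v, p_at_24v)  # identical regardless of supply voltage
```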

Thread Starter


Joined Mar 31, 2020
Thanks for the input! I was just overthinking it, I guess; I was having trouble separating the power dissipated in the wire from the power dissipated by the load.
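That split can be made explicit with a simple series model: supply across R_wire + R_load. At a fixed current, raising the supply voltage adds power to the load only; the wire's share stays fixed. The wire resistance here is an assumed illustrative value:

```python
# Wire-vs-load power split, modeled as R_wire in series with the load.
# R_WIRE is an assumed value for illustration.

R_WIRE = 0.02  # ohms, hypothetical conductor + contact resistance

def split_power(v_supply, current):
    """Return (power in wire, power in load) at a fixed test current."""
    p_total = v_supply * current
    p_wire = current ** 2 * R_WIRE
    return p_wire, p_total - p_wire

for v in (16.0, 24.0):
    p_wire, p_load = split_power(v, 5.0)
    print(f"{v:.0f} V @ 5 A: wire {p_wire:.2f} W, load {p_load:.2f} W")
```

Going from 80 W to 120 W total at the same 5 A, the extra 40 W all lands in the load; the wire heats exactly the same in both tests.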



Joined Jun 19, 2012
The test voltage is relevant only if you are breaking the live circuit; the contact points will then arc if the voltage is high enough.

Of course, at some much higher voltage the overall insulation will break down, but I doubt that's an issue here.