Why do thick wires run on low voltage?

Thread Starter

t00t

Joined Jan 22, 2015
73
Hi guys,
I have been told that wires that are thick, like those welding machine earthing wires, can only run on 100 volts.

Here are my questions:

1) If V = I × R, and I were to put 230 volts on such a thick wire, the current would just have to increase, right?

2) I have also heard that low current running in thick wires can cause a fire. How true is that?

3) My wires are about 25 mm thick; the internal copper is about 11 mm thick.

Please advise. Thank you.
 

Iam_Pyre

Joined Oct 31, 2017
4
Well, the thicker the wire, the lower the resistance; the free charge carriers have more cross-sectional area to move through in the metal lattice. The resistance equation R = ρL/A shows this.

I'm not going to confirm the claim about thick wires catching fire, but with a given applied voltage, the current through a thicker wire would be greater than the current through a thinner wire with the same voltage across it; there's less internal resistance in the thicker wire.
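
A minimal sketch of that resistance relation, R = ρL/A; the 10 m length and the two diameters are purely illustrative numbers, not taken from the thread:

```python
# Minimal sketch of R = rho * L / A for a round copper conductor.
# The 10 m length and the two diameters are illustrative numbers only.
import math

RHO_COPPER = 1.68e-8  # ohm-metres, approximate resistivity of copper at 20 C

def wire_resistance(length_m: float, diameter_mm: float) -> float:
    """Resistance in ohms of a solid round conductor."""
    area_m2 = math.pi * (diameter_mm / 1000.0 / 2.0) ** 2
    return RHO_COPPER * length_m / area_m2

for d_mm in (11.0, 2.0):
    r = wire_resistance(10.0, d_mm)
    print(f"{d_mm:4.1f} mm conductor, 10 m long: {r * 1000:6.2f} milliohms")
```
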

What's the application of your wires?
 

Thread Starter

t00t

Joined Jan 22, 2015
73
Well, the thicker the wire, the lower the resistance; the free charge carriers have more cross-sectional area to move through in the metal lattice. The resistance equation R = ρL/A shows this.

I'm not going to confirm the claim about thick wires catching fire, but with a given applied voltage, the current through a thicker wire would be greater than the current through a thinner wire with the same voltage across it; there's less internal resistance in the thicker wire.

What's the application of your wires?
I am running a 500 amp current through the wire at 23 VDC.

If the current drops to 200 amps, will the wire burn out?
 

DickCappels

Joined Aug 21, 2008
10,185
No, it will not burn out at 200 amps if it was fine at 500 amps.

Lower current in a given cable will result in a cooler cable.

For a given amount of power to a load, cables tend to get thicker as the voltage is lowered because, to deliver the same power, the current must be higher. The higher the current, the more power is lost in the cable, and beyond some point either the voltage at the load is too low (frequently the main consideration) or the cable is not capable of safely handling the current (rarely happens in real life).
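
A small sketch of that trade-off, assuming a made-up 5 kW load and 0.01 Ω of round-trip cable resistance (neither figure comes from the thread):

```python
# Sketch: delivering the same power at a lower voltage means more current,
# and the cable loss grows with the square of that current.
# The 5 kW load and 0.01 ohm cable resistance are made-up example numbers.
LOAD_POWER_W = 5000.0
CABLE_RESISTANCE_OHM = 0.01  # total out-and-back resistance of the run

for supply_v in (24.0, 230.0):
    current_a = LOAD_POWER_W / supply_v                   # I = P / V
    cable_loss_w = current_a ** 2 * CABLE_RESISTANCE_OHM  # P_loss = I^2 * R
    cable_drop_v = current_a * CABLE_RESISTANCE_OHM       # V_drop = I * R
    print(f"{supply_v:5.0f} V supply: {current_a:6.1f} A, "
          f"{cable_loss_w:6.1f} W lost in the cable, {cable_drop_v:4.2f} V dropped")
```
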
 

Thread Starter

t00t

Joined Jan 22, 2015
73
No, it will not burn out at 200 amps if it was fine at 500 amps.

Lower current in a given cable will result in a cooler cable.

For a given amount of power to a load, cables tend to get thicker as the voltage is lowered because, to deliver the same power, the current must be higher. The higher the current, the more power is lost in the cable, and beyond some point either the voltage at the load is too low (frequently the main consideration) or the cable is not capable of safely handling the current (rarely happens in real life).
Thank you for the reply.

One question, though: let's say that for some unexpected reason the voltage increases. Then, by the V = I × R equation, my current has to increase, right?

I am asking this because my load is at 500 amps but my wire's current rating is 600 amps.
 

dendad

Joined Feb 20, 2016
4,478
The thing that determines the voltage rating of a cable is not the conductor size but the insulation. The conductor's cross-sectional area, along with the conductor material, determines the maximum current it can carry.
Lowering the current in a given conductor will just let it run cooler.
Thick wires are usually used for lower voltages because the current has to be higher for a given power than at a high voltage. But if you have to run 500 amps at 1000 V, you will still need thick wires, just like for 500 amps at 12 V. The insulation needed for 1000 V, though, must be better than for 12 V.
It is quite OK to use a 1000 V-rated cable at 12 V as long as the conductor area is good enough for the current.
If there is a need to send electricity long distances, it is usually done at a much higher voltage so the current is lower, allowing a smaller-area conductor to be used. Long lengths of conductor can cost huge amounts, as the metal in it is expensive. So for long runs it is a trade-off between the losses produced by the conductor resistance and the cost of the conductor. High voltage means less current, so lower losses from the heating of the conductor by the current flowing through its resistance.
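
A rough sketch of that long-run trade-off, assuming a hypothetical 1 MW load, a 10 km run, and a 2% loss budget (all of these numbers are illustrative, not from the thread):

```python
# Sketch: for a fixed power and a fixed allowable cable loss, the copper
# cross-section you need shrinks rapidly as the transmission voltage rises.
RHO_COPPER = 1.68e-8         # ohm-metres
LINE_LENGTH_M = 2 * 10_000   # 10 km out and back
POWER_W = 1_000_000.0        # 1 MW delivered
MAX_LOSS_W = 0.02 * POWER_W  # allow 2 % loss in the conductors

for volts in (1_000.0, 10_000.0, 100_000.0):
    current = POWER_W / volts
    # P_loss = I^2 * rho * L / A  ->  A = I^2 * rho * L / P_loss
    area_m2 = current ** 2 * RHO_COPPER * LINE_LENGTH_M / MAX_LOSS_W
    print(f"{volts:9.0f} V: {current:8.1f} A, "
          f"conductor area {area_m2 * 1e6:10.1f} mm^2")
```
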
 

strantor

Joined Oct 3, 2010
6,798
My wires are about 25 mm thick; the internal copper is about 11 mm thick.
I have never seen or heard of any kind of wire like this! What on earth is it for? 7 mm (>0.25") of insulation is outrageous!
Where I work we use 3,300 V wire and its insulation is only 1 mm thick, so I assume the excessive insulation isn't for high voltage.
Is this a welding cable intended to be used on job sites, such that you can lay it on top of rocks and gravel and then drive over it with semi trucks?
 

Reloadron

Joined Jan 15, 2015
7,523
I have been told that wires that are thick, like those welding machine earthing wires, can only run on 100 volts.
Read dendad's post where he mentions:
The thing that determines the voltage rating of a cable is not the conductor size but the insulation. The conductor's cross-sectional area, along with the conductor material, determines the maximum current it can carry.
You may also want to take a look at high-voltage cross-country power lines, which can carry voltages exceeding 500,000 volts and are quite thick. Many high-voltage transmission lines do not even have any insulation but rely on the large insulators they are suspended from. I always enjoyed this little video; take note of the cable gauge, thick stuff.

Ron
 

DickCappels

Joined Aug 21, 2008
10,185
@strantor, according to post #1 it is a welding cable.

Thank you for the reply.

One question, though: let's say that for some unexpected reason the voltage increases. Then, by the V = I × R equation, my current has to increase, right?

I am asking this because my load is at 500 amps but my wire's current rating is 600 amps.
You have stated Ohm's law correctly, but in real life wouldn't the welder adjust the amperage so he can get the kind of weld he wants?

If the voltage increases, the load remains constant, and no adjustments are made, the current will increase. It is the job of the welder to make adjustments to get the kind of weld he wants. Correct?
 

strantor

Joined Oct 3, 2010
6,798
@strantor, according to post #1 it is a welding cable.
That's not exactly what he said...
Hi guys,
I have been told that wires that are thick, like those welding machine earthing wires, can only run on 100 volts.
Per my interpretation, there is a lot of ambiguity here. He could be talking about any old thick wire he found, whose closest match in his mental database is the cable used on welding machines. The application also sounds ambiguous: most likely welding, but with plenty of room to be almost anything else.
 

MrSoftware

Joined Oct 29, 2013
2,202
I think most people misunderstood the original question, because of the way it was asked. I believe the question is really, "I have been told that thick wires are only good for low voltage, why is this?" And the answer is, this is not true at all. As mentioned above, the insulation has more to do with the voltage rating than the wire. Wires used for high current applications, such as for welders, need to be thick so they can handle lots of current. It just so happens that many of those applications do not involve high voltages, so the wires don't need to be rated for high voltage. If the application required both high current and high voltage, then they would use thick wires and better insulation, so the wires would be both thick and rated for higher voltages. It's about the intended use case.
 

mcgyvr

Joined Oct 15, 2009
5,394
You use the appropriate gauge wire depending on the current going through it (see National Electrical Code Table 310.16 for current ratings, etc.).
You choose an insulation rating sufficient for (i.e., greater than) the voltage applied across it.

Typically power companies, etc., will up the voltage, thus lowering the current, so that they can use smaller wire. Smaller wire = less cost.

I = P/E
Sending power to a 1000 W heating element at 10 V: 1000/10 = 100 amps of current down that wire (done with something like a 2 AWG wire).
Sending power to a 1000 W heating element at 100 V: 1000/100 = 10 amps of current down that wire (done with a smaller and cheaper 16 AWG wire).
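
The same arithmetic as a quick sketch (the AWG sizes in the comments are just the ballpark figures from the post above):

```python
# Quick check of I = P / E for a 1000 W heating element at two supply voltages.
for volts in (10.0, 100.0):
    amps = 1000.0 / volts
    print(f"{volts:5.1f} V -> {amps:5.1f} A")
# 10 V  -> 100 A: needs a heavy conductor (roughly 2 AWG per the post above)
# 100 V ->  10 A: a much smaller, cheaper conductor (roughly 16 AWG) will do
```
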
 

Tonyr1084

Joined Sep 24, 2015
7,904
If the current drops to 200 amps, will the wire burn out?
If that were true, imagine how many wires would be burning out right now from having NO current through them at all.

Here's an experiment I tried many, many years ago with a dead car battery: I took a VERY VERY small (fine) wire and a dead 12 volt battery. It was obviously not completely dead, but as far as car batteries go it was considered dead because it could no longer start the car. Using a wire (from memory it must have been small, around 22 gauge) I wanted to see how much power was left in the battery. Being a kid and not knowing better, I shorted the battery's positive terminal to the negative terminal through this tiny wire. The wire instantly melted. So I took a larger piece of wire from some old house wiring my dad had laying around. It must have been at least 14 gauge and couldn't have been any bigger than 12 gauge. When I shorted the battery, the wire got hot but didn't melt. Same battery, same length of wire, just a very small wire versus a very large wire. The large wire withstood the dead battery's power while the small wire melted along its entire length (inside the insulation).

The larger the wire, the more current capacity it has. Don't confuse capacity with Ohm's law; in the real world one has nothing to do with the other. The exception is if you're running a circuit at 230 volts (from your original post) with a resistive load such as a light bulb (let's assume it's a 100 watt bulb). Ohm's law says that at 100 watts and 230 volts the current would be 435 milliamps (ignoring the startup current). If you use a 20 gauge wire for this circuit, everything would be fine (assuming the wire insulation could withstand the 230 volts). If you were to hook up this circuit using 14 gauge wire, there would be virtually no change to the circuit or its performance. The 14 gauge wire WILL (technically) run cooler, but you'd be hard pressed to detect the difference (my opinion).
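
A quick arithmetic check of that 435 mA figure (just P/V, nothing assumed beyond the numbers in the post):

```python
# I = P / V for a 100 W resistive load on a 230 V supply
print(f"{100.0 / 230.0 * 1000:.0f} mA")  # ~435 mA
```
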

Your house is typically wired with 12 and 14 gauge wire. For an electric stove or electric dryer, the gauge may be as big as 6 gauge because of the greater current needed by those appliances. But let's examine your living room, typically wired with 12 gauge wire (rated to handle 20 amps). Why doesn't your house burn down when you turn on the lamp, and only the lamp? It's drawing way less than 20 amps. So lower current is not detrimental to a larger-gauge wire.
 
Last edited:

Thread Starter

t00t

Joined Jan 22, 2015
73
@strantor, according to post #1 it is a welding cable.

I am under the impression that if the voltage rises, the operator will adjust the amperage to obtain the desired welding characteristics.

If the voltage increases, the load remains constant, and no adjustments are made, the current will increase. It is the job of the welder to make adjustments to get the kind of weld he wants.
Thanks for all the replies up to this point.

I think I wasn't clear in my first post. What I meant was that the cables are as thick as those welding earth cables.

But they are used in the aerospace industry, where a very high current is used to start jet engines for testing purposes.
 

GopherT

Joined Nov 23, 2012
8,009
Thanks for all the replies up to this point.

I think I wasn't clear in my first post. What I meant was that the cables are as thick as those welding earth cables.

But they are used in the aerospace industry, where a very high current is used to start jet engines for testing purposes.

All cables (wires) have some resistance. Thick wires can be used at any voltage, just like thin wires can be used at any voltage (the acceptable voltage depends on the insulation).

The thick cables are used for low voltage because the PERCENTAGE of power lost is greater in a low-voltage system.

If a pair of 20-foot 12-gauge cables has 0.066 ohms of resistance (total, out to and back from the load), a 12 V supply pushing 30 amps through the cable will cause a 2 volt drop across the length of the cable, so your load will only see 10 volts.

Repeat for a 120 V supply: the 30 amp load will get 118 volts. Unnoticeable.

Remember Ohm's law: the voltage dropped in the cable is amps × resistance = volts.
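
The two cases above worked through as a short sketch, using the same 0.066 Ω and 30 A figures:

```python
# Voltage drop = I * R for the cable pair, then what's left at the load.
CABLE_R_OHM = 0.066  # 20 ft out plus 20 ft back of 12 AWG, roughly
LOAD_A = 30.0

drop_v = LOAD_A * CABLE_R_OHM  # ~2 V lost in the cable either way
for supply_v in (12.0, 120.0):
    load_v = supply_v - drop_v
    print(f"{supply_v:5.1f} V supply -> {load_v:6.2f} V at the load "
          f"({drop_v / supply_v * 100:4.1f} % lost in the cable)")
```
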
 
Last edited:

Thread Starter

t00t

Joined Jan 22, 2015
73
All cables (wires) have some resistance. Thick wires can be used at any voltage, just like thin wires can be used at any voltage (the acceptable voltage depends on the insulation).

The thick cables are used for low voltage because the PERCENTAGE of power lost is greater in a low-voltage system.

If a pair of 20-foot 12-gauge cables has 0.066 ohms of resistance (total, out to and back from the load), 30 amps running through the cable will cause a 2 volt drop across the length of the cable, so your load will only see 10 volts.

Repeat for a 120 V supply: the 30 amp load will get 118 volts. Unnoticeable.

Remember Ohm's law: the voltage dropped in the cable is amps × resistance = volts.
Thanks for all the replies up till now.

One question: wouldn't the voltage drop across a wire be proportional to its length?

If so, is there a formula for this?
 

GopherT

Joined Nov 23, 2012
8,009
Thanks for all the replies up till now.

One question: wouldn't the voltage drop across a wire be proportional to its length?

If so, is there a formula for this?
Yes, length and current.

Look up AWG (American wire gauge) on Google. You will get a Wikipedia article that shows the resistance of various wire gauges per linear foot. Once you know the resistance and the amps flowing through the wire, you can calculate the voltage drop.
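
A small sketch of that calculation; the per-foot figure below is a typical published value for 12 AWG copper (about 1.59 mΩ per foot), so treat it as an assumption and look up the table for your actual gauge:

```python
# Round-trip voltage drop is proportional to both length and current.
R_PER_FOOT_OHM = 0.00159  # 12 AWG copper, approximate published value

def cable_drop(one_way_feet: float, amps: float) -> float:
    """Voltage drop for a two-conductor run (out and back)."""
    return amps * R_PER_FOOT_OHM * 2 * one_way_feet

print(cable_drop(20, 30))  # ~1.9 V, close to the 0.066-ohm example above
print(cable_drop(40, 30))  # doubling the length doubles the drop, ~3.8 V
```
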
 

strantor

Joined Oct 3, 2010
6,798
If a pair of 20-foot 12-gauge cables has 0.066 ohms of resistance (total, out to and back from the load), 30 amps running through the cable will cause a 2 volt drop across the length of the cable, so your load will only see 10 volts.
I think your statement is missing something. Did you mean to say 12V at 30A?
 