I'm trying to find a clear explanation of how to calculate the voltage drop for a given lead length.
If I'm powering a 24 V light that draws 0.5 A from a 24 Vdc power supply, how much voltage drop is there if I'm using 22 AWG (0.33 mm²) copper wire over 100 meters? Google says copper has a resistivity of 1.68e-8 Ω·m.
What do you think of the calculation on this Redarc page: LINK
I'm not sure why their calculation uses 0.017 rather than 1.68e-8.
Can someone give me an intuitive explanation and relate it to Ohm's law?
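In case it helps, here is my attempt at the calculation in Python. I'm assuming the 100 m is the one-way distance, so the current travels through 200 m of conductor (out and back), and I'm using the 1.68e-8 Ω·m figure from Google:

```python
# Voltage-drop estimate for a 24 V, 0.5 A load on 22 AWG copper.
# Assumption: 100 m is the one-way run, so the round-trip
# conductor length is 200 m.

rho = 1.68e-8      # resistivity of copper, ohm*m (from Google)
area = 0.33e-6     # 22 AWG cross-section, m^2 (0.33 mm^2)
length = 2 * 100   # round-trip conductor length, m
current = 0.5      # load current, A

resistance = rho * length / area   # R = rho * L / A
v_drop = current * resistance      # Ohm's law: V = I * R

print(f"R = {resistance:.2f} ohm, drop = {v_drop:.2f} V")
# -> R = 10.18 ohm, drop = 5.09 V
```

If that's right, I'd be losing over 5 V of my 24 V across the wire, which seems like a lot, so I'd like to confirm the method is correct.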