I've heard people claim that one could, for example, run a house off the potential difference between the earth (i.e. ground) and a tower. That is obviously untrue, but I was nonetheless unable to formulate a decent explanation of why. The best I could come up with was an analogous circuit which demonstrates how some huge voltages can in fact amount to less than others.
The circuit on the left supplies 10 kV DC to an LED, which requires a 470 kΩ resistor to limit the current to roughly 10.6 mA. The one in the middle merely "presents" a voltage of 10 kV. (And yet if it were just a dangling wire, we might confuse it with the leftmost one.)
The right-hand side shows what would happen if an LED were connected to the middle circuit with a path straight to ground: the voltage drops to less than 2 V. (Which, by the way, may not even be enough to light an *actual* LED, given typical forward-voltage requirements!)
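The collapse under load can be sketched numerically with a Thévenin/voltage-divider model: a source that "presents" a huge open-circuit voltage behind a huge source resistance delivers almost none of it to a real load. (The 1 GΩ source resistance below is my own assumed figure for illustration, not taken from the schematics above.)

```python
# Voltage-divider sketch: terminal voltage of a high-impedance source under load.

def loaded_voltage(v_open, r_source, r_load):
    """Voltage across r_load driven by a Thevenin source (v_open, r_source)."""
    return v_open * r_load / (r_source + r_load)

v_open = 10_000.0   # "presented" open-circuit voltage, volts
r_source = 1e9      # assumed very large source resistance, ohms
r_load = 470e3      # the 470 k resistor as the load, ohms

v = loaded_voltage(v_open, r_source, r_load)
print(f"{v:.1f} V across the load")  # only a few volts survive
```

With these assumed numbers, the 10 kV collapses to under 5 V at the load, which is the same qualitative behaviour as the middle circuit: the "voltage" was real, but the source could never supply the current needed to sustain it.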
Is there a name for this general principle?