"I think I'm missing something here: how can they transmit at a high voltage and low current when there needs to be a high resistance according to 'V = IR'? Where does the high resistance come from?"

Your basic premise is incorrect. You are assuming that the purpose of the wire is to limit the current so as to hold the voltage constant. In fact, it is the wire resistance that is (relatively) constant, and the voltage and current vary according to a rearrangement of Ohm's Law: R = E / I. For a constant resistance, like a fixed length of wire, if you increase the voltage, the current increases also.
More important is Watt's Law, P = E x I. This is what drives power companies' designs. If you want to deliver 2000 W, this can be 1000 V at 2 A or 100 V at 20 A. With just this equation there is no inherent advantage to either high or low voltage.

But next comes Joule's Law, P = I^2 x R, which falls out from a combination of Ohm's and Watt's Laws. With Joule's Law it is clear that for any particular wire resistance (the wire is the resistor in this case), the power dissipated in the wire grows as the square of the current. This is why there is an advantage to high voltage/low current when transporting large amounts of power over long distances. In the previous example, if the 2000 watts is going through a wire with 2 ohms of resistance, then at 1000 V and 2 A the loss in the wire is 8 W, but at 100 V and 20 A the loss is 800 W.
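If it helps to see the arithmetic spelled out, here is a minimal Python sketch (not part of the original answer; the function name and figures simply restate the 2000 W / 2 ohm example above):

```python
def line_loss(power_delivered_w, line_voltage_v, wire_resistance_ohm):
    """Power dissipated in the wire, from Joule's law: P_loss = I^2 * R."""
    current_a = power_delivered_w / line_voltage_v   # Watt's law: I = P / E
    return current_a ** 2 * wire_resistance_ohm

# Same 2000 W load, same 2 ohm wire, two different transmission voltages.
for voltage in (1000, 100):
    current = 2000 / voltage
    loss = line_loss(2000, voltage, 2)
    print(f"{voltage} V: current = {current:.0f} A, wire loss = {loss:.0f} W")

# Output:
# 1000 V: current = 2 A, wire loss = 8 W
# 100 V: current = 20 A, wire loss = 800 W
```

Dropping the current by a factor of 10 cuts the wire loss by a factor of 100, which is exactly the square-law relationship described next.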
The important relationship is that the power loss in a resistor or wire increases as the square of the current. That adds up.
ak