Consider a constant supply voltage VS and a consumer with a power rating of P watts. The usual procedure for calculating the voltage drop on the cable connecting the supply to the consumer applies Ohm's law like this:

1. Calculate the current drawn by the consumer: I = P/VS
2. Calculate the voltage drop on the cable: VD = R*I, where R = resistivity*length/area

So they assume the consumer will draw that current. But the actual current through the circuit is I = VS/(RC+R), not I = VS/RC = P/VS, where RC is the resistance of the consumer. Why do they start by assuming the current will be as if the whole supply voltage falls on the consumer? The voltage the consumer actually sees is less than the supply voltage, so the current it draws is smaller too.
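To make the difference concrete, here is a small numeric sketch of the two calculations. The values (230 V supply, 2300 W consumer, 0.5 ohm cable) are illustrative assumptions, not from the question; the consumer is modeled as a fixed resistance RC = VS^2/P:

```python
# Illustrative values (assumptions): 230 V supply, 2300 W consumer,
# 0.5 ohm cable resistance.
VS = 230.0   # supply voltage [V]
P = 2300.0   # consumer power rating [W]
R = 0.5      # cable resistance [ohm]

# Usual approximation: assume the full supply voltage across the consumer.
I_approx = P / VS          # 10.0 A
VD_approx = R * I_approx   # 5.0 V drop

# Exact series circuit: model the consumer as a fixed resistance
# RC = VS^2 / P and solve I = VS / (RC + R).
RC = VS**2 / P             # 23.0 ohm
I_exact = VS / (RC + R)    # ~9.79 A, slightly less than 10 A
VD_exact = R * I_exact     # ~4.89 V drop

print(I_approx, VD_approx)
print(round(I_exact, 3), round(VD_exact, 3))
```

With these numbers the approximation overestimates the current by about 2%, which shows why the shortcut is common in practice: when RC >> R the error is small, but it is still an approximation, as the question points out.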