Hi,
First, I want to say that I searched the forum for similar topics and found some, but none of them helped me understand the issue.
So here's what I can't understand:
We use a step-up transformer to increase the voltage on a power line - which voltage do we increase, the phase voltage or the line voltage?
Increasing the voltage gives us a smaller current - how? The resistance of the power grid stays the same, so how can we decrease the current by increasing the voltage? Increased voltage makes the electrons move faster, and that should give a greater current. I know that P = R * I^2, but I don't understand why the power stays the same before stepping up the voltage and after.
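To make the claim concrete, here is how I understand it, written as a quick Python sketch (all the numbers here are made up by me): for a fixed power P that has to be delivered, the line current would follow from P = V * I, and the loss in the line from R * I^2.

```python
# The claim as I understand it: for a fixed transmitted power P,
# raising the line voltage V lowers the current I, because P = V * I.
P = 10_000.0   # power to deliver, in watts (made-up number)
R_line = 2.0   # resistance of the transmission line, in ohms (made-up)

for V in (200.0, 400.0):
    I = P / V                # current drawn at this voltage
    loss = R_line * I ** 2   # power lost as heat in the line
    print(f"V = {V:6.1f} V  ->  I = {I:5.1f} A, line loss = {loss:7.1f} W")
```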
It's like the situation with a 60 W bulb. If I apply 20 V to it, some amount of current will flow, and that current is I = U / R, where R is the resistance of the bulb. If I increase the voltage to 40 V, there will be 2 times the current there was before (because 2U = (2I) * R). As I understand it, the 60 W mark on a bulb only shows the maximum power it can handle; the power won't always be equal to 60 W, it depends on what voltage we apply. It might be just 30 W or 40 W if we apply a low voltage.
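Here is my bulb reasoning as a similar sketch (the resistance value is a made-up constant, and I'm ignoring that a real filament's resistance changes with temperature):

```python
# My bulb reasoning: with a fixed resistance R, doubling U doubles I,
# so the power U * I goes up, not down. (R is a made-up constant here.)
R = 10.0  # assumed bulb resistance, in ohms

for U in (20.0, 40.0):
    I = U / R   # Ohm's law: current through the bulb
    P = U * I   # power dissipated at this voltage
    print(f"U = {U:4.1f} V  ->  I = {I:3.1f} A, P = {P:5.1f} W")
```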
When I hear about that higher-voltage-with-lower-current thing, I think about the bulb described above. If a higher voltage gives a lower current in the power grid, why doesn't the same apply to a bulb? Why does a higher voltage give us a smaller current in the power grid, but a higher current in the circuit with a bulb?
Please help me with this, because it is holding me back and making it impossible for me to understand more complex things about power transformation.