Transformer: Stepping up voltage: Resistance?

Discussion in 'Homework Help' started by snowfox, May 21, 2008.

  1. snowfox

    Thread Starter Member

    Oct 21, 2007
    General Question I was pondering about stepping up voltage:

    Powerplants step up voltage to transfer power over long distances.

    Reasons why:

    -They can use smaller wire, since less current flows

    -Less Heat Loss

    -Voltage loss won't be as significant, since the drop along the line is small compared to the line voltage

    Here's what I don't get...

    P= I*V

    When you step up voltage you lower current...

    Ohm's law: R = V/I

    So does that mean that when you step up your voltage, you are raising your resistance drastically?

    How does that help voltage travel long distances if the resistance is going to be higher?

    Just getting confused; see if anyone can help me understand this stupid brain fart.

    Please correct if I have stated anything wrong. Thanks in advance guys.
  2. thingmaker3

    Retired Moderator

    May 16, 2005
    That is exactly what it means. :)

    The output impedance of the step-up transformer is higher than the input impedance. So the output EMF will be higher, the output current will be lower, and the power will be the same (minus some small unavoidable losses).

    You can read more about transformers here:
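    As a quick numeric sketch of that impedance transformation (my own example numbers, not from the thread): an ideal transformer with turns ratio n = Ns/Np makes an impedance Z on the secondary side look like Z/n^2 from the primary side.

```python
# Ideal-transformer impedance reflection (illustrative sketch):
# an impedance of z_load ohms on the secondary appears as z_load / n^2
# from the primary, where n = N_secondary / N_primary (n > 1 = step-up).
def reflected_impedance(z_load: float, n: float) -> float:
    """Impedance seen by the source driving the primary."""
    return z_load / n**2

# A 1:10 step-up (n = 10) makes a 100-ohm line-plus-load look like
# 1 ohm to the generator: voltage up 10x, current down 10x, same power.
print(reflected_impedance(100.0, 10.0))  # -> 1.0
```

    So "higher resistance on the HV side" is a matter of viewpoint: the generator still sees a low impedance through the transformer.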
  3. snowfox

    Thread Starter Member

    Oct 21, 2007
    Okay this is gonna sound stupid, but:

    Does high output impedance mean that the current-voltage (power) won't travel as far?
  4. Wendy


    Mar 24, 2008
    The resistance of the wires in the transmission lines is constant. By reducing the current going through them you reduce the voltage dropped across them. The voltage between the two wires is going to be much greater, but that doesn't cause loss, just an insulation hazard.
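    A quick sketch of that point with round made-up numbers (mine, not from the thread): same delivered power, same wire resistance, two different line voltages.

```python
# Voltage dropped across a fixed-resistance line for the same delivered
# power at two transmission voltages (assumed round numbers).
P = 1_000_000.0   # 1 MW to deliver
R_line = 2.0      # ohms of wire resistance, the same in both cases

for v in (10_000.0, 100_000.0):   # 10 kV vs 100 kV on the line
    i = P / v                     # current needed at this voltage
    drop = i * R_line             # Ohm's law across the wire itself
    print(f"{v/1e3:.0f} kV line: {i:.0f} A, {drop:.0f} V dropped in the wire")
```

    Ten times the line voltage means a tenth of the current and a tenth of the voltage dropped in the wire, and since the heat is I^2*R, a hundredth of the loss.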
  5. recca02

    Senior Member

    Apr 2, 2007
    Transformers are used to reduce power losses - the I^2*R losses.
    Here R is the resistance of the transmission lines.
    For a power demand of X watts, the load draws a current of I amps.
    If we take the resistance of the load as r, we have I^2*r of useful power (say 500 MW) and an I^2*R loss (say 50 MW), requiring us to generate I^2*(R+r).
    What we do with transformers is step this current down to a lower value (i) for the transmission lines only, then step it back up as required for the loads. Thus we now have

    I^2*r (demand: 500 MW) + i^2*R (loss: 50 kW).

    In a true sense nothing happens to the resistance itself. The transformer primary just 'sees' the resistance on the HV side as higher. This might help you understand it better.
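    That I^2*R arithmetic can be sanity-checked in a few lines. This is a sketch with round assumed numbers (the post's 50 MW vs 50 kW figures would imply roughly a 31x current reduction; I use a round 10x here):

```python
# I^2*R line loss for the same 500 MW demand at two line currents.
# The line resistance never changes; only the current through it does.
R_line = 2.0        # ohms of line resistance (assumed)
P_demand = 500e6    # 500 MW delivered to the loads

I_low_v = P_demand / 100e3    # 5000 A if transmitted at 100 kV
I_high_v = P_demand / 1e6     #  500 A if stepped up to 1 MV

print(I_low_v**2 * R_line / 1e6)   # -> 50.0  (MW lost at 100 kV)
print(I_high_v**2 * R_line / 1e6)  # -> 0.5   (MW lost at 1 MV)
```

    Cutting the current 10x cuts the loss 100x, with the same wire.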
    Last edited: May 22, 2008