Why is voltage a factor in power?

Thread Starter

jaygatsby

Joined Nov 23, 2011
182
P = EI. But current is a given amount of electrons passing through a given point for a time duration, right? If current is the same with 10V behind it, or 100V behind it, the same amount of electrons are passing through the point for the time duration... so why is voltage considered at all? I know we must have bigger resistors and whatnot for higher power, but why, since the same amount of electrons are passing when the current stays the same? I don't understand why voltage matters aside from pushing the electrons out of the power source in the first place...

Thanks
 

crutschow

Joined Mar 14, 2008
34,285
It's the amount of push as well as the number of electrons that determines the power. The more resistance to the flow of electrons, the more power it takes to move the electrons and that takes more push (voltage). If the current flows without resistance (as in a superconductor) then there is no voltage drop and no power dissipated.

A rough analogy is water in a pipe. The more water pressure, the more work (power) the water can do when it is flowing, as through a turbine. The water (the electrons) is just the working fluid that carries the power. The electrons do not have power by themselves.
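To put rough numbers on that point, here is a small Python sketch (the currents and resistances are made up for illustration). It computes dissipated power for a fixed current pushed through different resistances, using P = V * I with V = I * R, i.e. P = I²R:

```python
# For a FIXED current, the power dissipated depends on the resistance
# the current is pushed through: P = V * I, with V = I * R (Ohm's law).
def dissipated_power(current_a, resistance_ohm):
    voltage_v = current_a * resistance_ohm   # Ohm's law: V = I * R
    return voltage_v * current_a             # Joule's law: P = V * I

I = 2.0  # the same 2 A in every case
for R in (0.0, 10.0, 100.0):
    print(f"R = {R:5.1f} ohm -> P = {dissipated_power(I, R):6.1f} W")

# R = 0 ohm (a superconductor) dissipates no power at all,
# even though exactly the same current is flowing.
```

Same electrons passing the same point per second in every case, yet the power differs, because the voltage (push) needed differs.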
 

GetDeviceInfo

Joined Jun 7, 2009
2,192
You must first consider what power is.

If you scoop a shovel of sawdust into a wheelbarrow, then scoop the same volume of gravel into the wheelbarrow, the volume of the scoop is analogous to the quantity of current flow. The resistance to lifting is overcome by the force you exert, which is analogous to voltage. Notice how the resistance of the load changes while the volume stays the same. Because you exert more force in the latter case, more work is being performed.
 

Adjuster

Joined Dec 26, 2010
2,148
More pressure results in higher current for a given resistance. So we could just use current only, since changing the voltage changes the current? Rate of electron flow in terms of amount of electrons and speed is all about current -- the voltage just pushes it to that rate. I am really confused why voltage is considered in the formula for power due to this :(
No, it just does not work (pun!) like that. This is a new subject for you, but the relationship that power is equal to voltage times current is a very long established one, accepted by electrical workers everywhere. It has been known, in the form of Joule's Law, since the middle of the century before last. http://www.wolframalpha.com/input/?i=joule-lenz+law

It may help if you understand the idea of mechanical work, equivalent to force multiplied by distance. Mechanical power then is the rate of doing work, so it is equal to force times speed.

Force by itself does not produce mechanical power - think of a tensed bow, ready to shoot an arrow: until the bowman lets loose, nothing happens.

Similarly, speed by itself is not mechanical power: it is only in the familiar world full of friction that speed implies power. No power is needed to keep the planets moving in the vacuum of space.

To develop electrical power, voltage on its own does nothing. A battery may have 12 volts across its terminals, but until current flows from it, it will not deliver any power.

The idea of current without power is harder to imagine, but this is because the practical conductors we are used to all have resistance, so in the real world current flow produces at least some voltage, and hence power. In a super-conductor with no resistance, current may flow with zero voltage, and zero power.

Power results from current flowing across a potential difference (that is, flowing across a voltage). http://en.wikipedia.org/wiki/Electrical_power
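If it helps, here is the mechanical/electrical parallel above as a tiny Python sketch (the force, speed, voltage and current values are just made-up examples):

```python
# Mechanical power is force times speed; electrical power is
# voltage times current. Either factor alone gives zero power.
def mechanical_power(force_n, speed_m_s):
    return force_n * speed_m_s        # P = F * v (watts)

def electrical_power(voltage_v, current_a):
    return voltage_v * current_a      # P = V * I (watts)

# Force alone (a tensed bow, speed = 0) or voltage alone (an
# open-circuit 12 V battery, current = 0) delivers no power:
print(mechanical_power(100.0, 0.0))   # 0.0
print(electrical_power(12.0, 0.0))    # 0.0

# Both factors together do:
print(electrical_power(12.0, 2.0))    # 24.0
```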
 

kubeek

Joined Sep 20, 2005
5,794
In a super-conductor with no resistance, current may flow with zero voltage, and zero power.
That is the power lost (dissipated) in the wires. A circuit connected through the superconducting wires would still use power equal to voltage times current.
 

Adjuster

Joined Dec 26, 2010
2,148
That is power lost or dissipated power in the wires. The circuit that would be connected through the superconducting wires would still use power as voltage times current.
True, but in that circuit a voltage drop would occur. The point I am painfully trying to explain to the OP is that both current and voltage are required for electrical power to be dissipated.

The OP is trying to argue that current flow by itself defines power, and voltage need not be considered. He needs to be disabused of this idea.

I am well aware that if the resistance of the circuit is known, the voltage developed can be calculated from the current, V=IR, but we must be careful how we explain this to the OP, who seems to be very confused about the matter.
 

wmodavis

Joined Oct 23, 2010
739
You have to use more than one factor to determine power because that is the way it is defined, and with good reason. If you do not want to use voltage and current, then use current and resistance. But it takes two. If you don't include voltage in the formula you cited, you only have current, not power. Power is not current. They are defined differently and mean different things.
 

thatoneguy

Joined Feb 19, 2009
6,359
Hook up a 100 W, 100 ohm resistor (or a similar resistive load, say a light bulb) to an adjustable power source.

Feel how hot it is at 1V
Turn the voltage up to 10V, how hot does it get?
How hot does it get at 100V?

Heat dissipated is power, higher voltage will push more current through the same resistor value, thus dissipating more power as heat.
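The arithmetic for that experiment, sketched in Python. This assumes the resistance stays at a constant 100 ohms (a real bulb's resistance rises with temperature, so treat the numbers as illustrative):

```python
# Power dissipated in a fixed 100 ohm resistor at different voltages.
R = 100.0                # ohms, assumed constant
for V in (1.0, 10.0, 100.0):
    I = V / R            # Ohm's law: I = V / R
    P = V * I            # equivalently P = V**2 / R
    print(f"{V:5.1f} V -> {I:5.2f} A, {P:7.2f} W")

# 1 V   -> 0.01 W (nothing to feel)
# 10 V  -> 1 W    (warm)
# 100 V -> 100 W  (the resistor's full rating: hot!)
```

Ten times the voltage pushes ten times the current, so the power goes up a hundredfold.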
 

Adjuster

Joined Dec 26, 2010
2,148
Here is an illustration of how voltage counts: think about a set of old-fashioned Christmas tree lights. These are little filament bulbs, connected in a line end to end (in series). Suppose there are 20 bulbs, each rated at 6V, 0.1A. Because the lamps are connected in series, the total voltage required is the sum of the voltages of each bulb added together. The total voltage is 20 times 6V = 120V, so they can be fed by a 120V supply.

Different numbers of lamps of a given type need different voltages, e.g. 40 6V lamps would need 240V. Now for the punch line: each bulb has a power of 0.1A * 6V = 0.6W. For the same current of 0.1A, a different number of lamps can be lit in series, depending on how much voltage is available. With 6V only a single 0.6W lamp can be used, two lamps for 12V, and so on up to 40 lamps at 240V. So, for a given current, more voltage equals more power.
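The series-string arithmetic above, as a short Python sketch (using the 6 V / 0.1 A lamp ratings from the example):

```python
# Series string of identical 6 V, 0.1 A lamps: the voltages add,
# while the same 0.1 A flows through every lamp in the string.
LAMP_V, LAMP_I = 6.0, 0.1

def string_power(n_lamps):
    supply_v = n_lamps * LAMP_V      # series voltages add
    return supply_v * LAMP_I         # P = V * I, same current throughout

for n in (1, 2, 20, 40):
    print(f"{n:2d} lamps need {n * LAMP_V:5.0f} V "
          f"and dissipate {string_power(n):4.1f} W")
```

Twenty lamps take 120 V and dissipate 12 W; forty lamps take 240 V and dissipate 24 W. Same current, more voltage, more power.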
 