Volts, amps and watts relation

Thread Starter

knowledgethirst

Joined Oct 8, 2007
5
Even though Ohm's law reveals this, I cannot quite make sense of the following: a 100 watt light bulb operating on 110 volts draws 0.90909 amps. I know that this amp figure represents a physical number of electrons moving past a point each second; they are what make possible the 100 watts of power the bulb is producing. Now, when I change to 220 volts, the light bulb draws only 0.45455 amps. Fewer electrons in the same amount of time, yet the same amount of work is done. Could anybody help me sort this out? Thanks
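(For reference, the two currents come straight from rearranging the power formula to I = P / V; a quick Python sketch using only the 100 W rating and the two line voltages given above:)

    # Current drawn by a 100 W bulb at two line voltages, from I = P / V.
    power_w = 100.0
    for voltage_v in (110.0, 220.0):
        current_a = power_w / voltage_v
        print(f"{voltage_v:.0f} V: {current_a:.5f} A")
    # Prints roughly 0.90909 A at 110 V and 0.45455 A at 220 V:
    # doubling the voltage halves the current for the same power.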
 

Thread Starter

knowledgethirst

Joined Oct 8, 2007
5
When you mention "pushed twice as hard", it has nothing to do with the electrons' rate or speed, right? As far as I know, they drift slowly no matter what the voltage is. Or are you saying that each electron, moving at the same speed, can carry a different amount of energy depending on the voltage applied? Is my understanding in line with your explanation?
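(A small sketch of the "energy per charge" view being asked about here: a volt is a joule per coulomb, so at 220 V each coulomb of charge delivers twice the energy, and only half as many coulombs per second are needed for 100 W. The figures are just the ones from the original question.)

    # A volt is a joule per coulomb: the energy carried by each unit of charge.
    power_w = 100.0
    for voltage_v in (110.0, 220.0):
        current_a = power_w / voltage_v      # coulombs per second
        watts = voltage_v * current_a        # joules delivered per second
        print(f"{voltage_v:.0f} V: {current_a:.5f} A, {watts:.0f} W")
    # Half as many coulombs per second at 220 V, but each coulomb carries
    # twice the energy, so the power is 100 W in both cases.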
 

recca02

Joined Apr 2, 2007
1,212
The rate of electron flow is a combination of speed as well as the number of electrons.
A high voltage means a high potential energy per unit charge, and that potential energy is converted into kinetic energy by the potential difference.
Electron speed is limited by collisions, in which the electrons lose their KE; that energy turns up as heat and light in the bulb. So for a given amount of power, the resistance is chosen in accordance with the voltage: resistance plays an important role in how much energy is dropped, and voltage is the measure of the potential energy available per unit charge.
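(A quick sketch of that last point about resistance, using R = V^2 / P, which follows from combining P = V I with V = I R; the only inputs are the 100 W rating and the two line voltages from the thread.)

    # Resistance a 100 W bulb needs at each line voltage, from R = V**2 / P.
    power_w = 100.0
    for voltage_v in (110.0, 220.0):
        resistance_ohm = voltage_v ** 2 / power_w
        print(f"{voltage_v:.0f} V bulb: {resistance_ohm:.0f} ohms")
    # Roughly 121 ohms for the 110 V bulb and 484 ohms for the 220 V bulb:
    # four times the resistance keeps the power the same at double the voltage.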
 

KrisKizlyk

Joined Jan 6, 2007
8
Look at it this way... you have a higher potential energy per unit charge at 220V than at 110V. So at the lower voltage, the bulb has to take more current from the source to deliver the same power.

Example... if you put one rock halfway up a cliff and another all the way at the top, of course the top one has the higher potential energy. Same principle.
 

bloguetronica

Joined Apr 27, 2007
1,541
Hold the phone... I think we all missed something here:
Have you actually measured the current? Are you sure it isn't 1.81818 amps? Are you sure it isn't 400 watts at that point?
He is considering two different 100W light bulbs that work at different voltages. Assuming that, he is correct.

Of course, if you take a 100W bulb designed for 110V and subject it to 220V, the bulb will dissipate 400W. The bulb would need four times the resistance to dissipate the same 100W.
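(A short sketch of that scenario, treating the 110 V bulb as a fixed resistor of about 121 ohms; a real filament's resistance rises with temperature, so this is only the idealized Ohm's-law picture.)

    # A 110 V / 100 W bulb modelled as a fixed resistor, placed on 220 V.
    rated_voltage_v = 110.0
    rated_power_w = 100.0
    resistance_ohm = rated_voltage_v ** 2 / rated_power_w   # about 121 ohms

    applied_voltage_v = 220.0
    current_a = applied_voltage_v / resistance_ohm          # about 1.818 A
    power_w = applied_voltage_v * current_a                 # about 400 W
    print(f"{current_a:.5f} A, {power_w:.0f} W")
    # To stay at 100 W on 220 V the bulb would need 484 ohms, i.e. four
    # times the resistance, not twice.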
 

cherry886

Joined Apr 23, 2008
2
This is Cherry from China. I am a newcomer here, and I want to learn some knowledge about electrical equipment and make friends with you all. Many thanks to all.
 

Caveman

Joined Apr 15, 2008
471
You say that at high voltage the current is decreased, but to my knowledge

V = I R

so the relation of V and I is directly proportional. Am I right?

You are right, if you assume R is constant. He is being ambiguous, but I'm pretty sure he intended to say that he is maintaining a constant power, not a constant resistance. In that case,

P = V I

is the correct equation.
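(A short sketch contrasting the two readings, using only the formulas above and the 100 W / 121 ohm figures already worked out in this thread.)

    # Constant resistance (same physical bulb): doubling V doubles I and quadruples P.
    resistance_ohm = 121.0
    for voltage_v in (110.0, 220.0):
        current_a = voltage_v / resistance_ohm
        print(f"R fixed: {voltage_v:.0f} V -> {current_a:.3f} A, {voltage_v * current_a:.0f} W")

    # Constant power (a bulb rated for each voltage): doubling V halves I.
    power_w = 100.0
    for voltage_v in (110.0, 220.0):
        current_a = power_w / voltage_v
        print(f"P fixed: {voltage_v:.0f} V -> {current_a:.3f} A")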
 

beenthere

Joined Apr 20, 2004
15,819
The confusion seems to me to come from treating the two cases as the same lamp whose resistance changes with the applied voltage. This is incorrect, as you have to choose a lamp rated for the line voltage.

He is correct about the power being the same in both cases, because the current in the 220 volt lamp is half the current in the 110 volt lamp.
 