real power

Thread Starter

becca02

Joined Jan 28, 2009
10
My HW problem states:
a) Find the internal resistance of a 60W light bulb. Note: the power rating of a light bulb is determined by measuring the real power dissipated when the standard, 120V/60Hz AC outlet voltage is applied to the bulb.

b) The light bulb is connected in series with a 20Ω resistor and the standard AC outlet voltage is applied to the series combination. What is the real power dissipated in the light bulb?

c) What is the real power dissipated in the resistor?

So this is what I got, but I'm not sure what the difference is between regular power and real power:

a) P = V^2 / R, so R = V^2 / P
R = (120V)^2 / 60W = 240Ω

b)
I first found the current I:
I = V / Req (where Req = 240Ω + 20Ω = 260Ω)
I = 120V / 260Ω = 0.462A

Then, real power = (I^2)(R),
so P = (0.462A)^2 (240Ω) = 51.23W

c) P = (0.462A)^2 (20Ω) = 4.27W
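
For reference, here is a quick check of that arithmetic in Python, assuming the bulb keeps the same 240Ω resistance it has at 120V (the unrounded current is 120/260 ≈ 0.4615A, so the exact powers come out a little below the rounded figures above):

# Quick check of the arithmetic, assuming the bulb stays at 240 ohms
V = 120.0                      # outlet voltage (RMS volts)
P_rated = 60.0                 # rated bulb power (watts)

R_bulb = V**2 / P_rated        # part a): 240.0 ohms
R_total = R_bulb + 20.0        # bulb in series with the 20-ohm resistor
I = V / R_total                # series current, about 0.4615 A

P_bulb = I**2 * R_bulb         # part b): about 51.1 W
P_resistor = I**2 * 20.0       # part c): about 4.26 W

print(R_bulb, I, P_bulb, P_resistor)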

Are these the right answers?
 

t_n_k

Joined Mar 6, 2009
5,455
Your calculation of the light bulb resistance at 120V is fine. Not so for the current: it is more direct to find the current using P = V x I, hence I = 0.5A. You would expect to use the stated 60W rating of the globe as the starting point.

When the light is connected in series with the 20 ohm resistor the two components must each share a portion of the total 120V applied. Just assuming the light has a resistance of 240 ohm at a lower voltage is incorrect - light globes exhibit a non-linear resistance! The question itself therefore lacks a key piece of information. One would either have to assume a non-linear relationship and solve the problem from there (not a simple algebraic operation) or be provided with a graph of the light globe V-I curve and use a graphical solution to find the required values.
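
As a rough illustration only, one could postulate a power-law V-I curve for the filament and solve the series circuit numerically. The 1.6 exponent below is purely an assumed placeholder, not data from the problem; with a real V-I graph you would read the operating point off the curve instead:

# Illustrative sketch only: assume the bulb follows V_bulb = 120*(I/0.5)**1.6,
# a power-law V-I curve anchored at the rated point (120 V, 0.5 A).
# The 1.6 exponent is an assumed placeholder, not given in the problem.

def v_bulb(i):
    return 120.0 * (i / 0.5) ** 1.6

def kvl_error(i):
    # KVL around the series loop: 20*I + V_bulb(I) must equal 120 V
    return 20.0 * i + v_bulb(i) - 120.0

# Bisection between 0.1 A and 0.5 A, where kvl_error changes sign
lo, hi = 0.1, 0.5
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if kvl_error(lo) * kvl_error(mid) <= 0:
        hi = mid
    else:
        lo = mid

i = 0.5 * (lo + hi)
print("series current:", i, "A")
print("bulb power:", v_bulb(i) * i, "W")
print("resistor power:", 20.0 * i**2, "W")

The point is simply that the operating point, and hence how the power splits between the bulb and the resistor, depends on the bulb's actual V-I curve rather than on a fixed 240Ω value.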
 

mik3

Joined Feb 4, 2008
4,843
If you assume that the light bulb's resistance remains constant then your calculations are correct.

t_n_k,

If this is a school question, I don't think they expect such complicated solutions.
 