If 40 W, 60 W, 100 W and 200 W bulbs are connected in series, which bulb will burn the brightest?
This is not right if the lamps are designed to give their rated wattages at the same voltage. It all depends on how the lamps are rated.
The 200 W, if it glowed at all, because it would get 50% of the available power (this assumes the lamps are rated for the same current, so resistance is proportional to rated wattage). The 100 W would get 25%, the 60 W 15%, and the 40 W only 10%.
It is. The answer of 40 W seems accurate: with lamps rated for the same voltage, the 40 W bulb has the highest resistance, so in series it dissipates the most power.
What? But the point is, when we connect them in series the current will be lower than the rated current of the bulbs.
And what if we connect all of them in parallel, keeping the voltage constant?
Then the highest-wattage bulb will carry the greatest current, and it will be the brightest.
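To make the two cases concrete, here is a small sketch (my own addition, not from the thread). It treats each bulb as a fixed resistor rated to deliver its nameplate wattage at the same supply voltage, so R = V²/P; a real filament's resistance rises with temperature, so actual numbers would differ.

```python
V = 230.0                      # assumed supply voltage (illustrative)
ratings = [40, 60, 100, 200]   # rated wattages

# Resistance of each filament at its rated operating point: R = V^2 / P.
resistances = [V**2 / p for p in ratings]

# Series: one common current I = V / sum(R); each bulb dissipates I^2 * R,
# so power is proportional to R and the LOWEST-wattage bulb is brightest.
i_series = V / sum(resistances)
p_series = [i_series**2 * r for r in resistances]

# Parallel: the full V appears across each bulb, so each dissipates its
# rated power and the HIGHEST-wattage bulb is brightest.
p_parallel = [V**2 / r for r in resistances]

for p, ps, pp in zip(ratings, p_series, p_parallel):
    print(f"{p:>4} W bulb: series {ps:6.1f} W, parallel {pp:6.1f} W")
```

Running this shows the series powers ordered opposite to the ratings (the 40 W bulb gets the largest share), while in parallel each bulb simply dissipates its rated power.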