If 40W, 60W, 100W and 200W bulbs are connected in series, which bulb will burn most brightly?
This is only well-defined if the lamps are designed to give their rated wattages at the same supply voltage. It all depends on the voltage they were rated for.
The 40W. In series every bulb carries the same current, so each dissipates P = I²R, and the 40W bulb has the highest filament resistance (R = V²/P_rated). It gets roughly 44% of the available power; the 60W gets about 29%, the 100W about 18%, and the 200W only about 9%, so the 200W might barely glow at all.
It is; 40W is the right answer.
What? But the point is, when we connect them in series the current will be lower than the rated current of the bulbs.
And what if we connect all of them in parallel, keeping the voltage constant? Then each bulb sees the full rated voltage, so the highest-wattage bulb draws the greatest current and is the brightest.
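The series vs. parallel comparison above can be checked with a few lines of Python. This is a minimal sketch, assuming all four bulbs are rated for the same supply voltage (230 V is an assumed value) and that filament resistance is constant; real filaments have much lower resistance when cold.

```python
# Sketch: power dissipated by rated-wattage bulbs in series vs. parallel.
# Assumption: all bulbs rated at the same voltage V; resistance treated as constant.

V = 230.0                      # assumed rated/supply voltage (volts)
ratings = [40, 60, 100, 200]   # rated wattages

# R = V^2 / P_rated: the 40 W bulb has the highest resistance.
resistances = [V**2 / p for p in ratings]

# Series: same current through every bulb, I = V / R_total,
# so each bulb dissipates P = I^2 * R -- power grows with resistance.
r_total = sum(resistances)
i_series = V / r_total
series_power = [i_series**2 * r for r in resistances]

# Parallel: each bulb sees the full voltage, so it delivers its rating.
parallel_power = [V**2 / r for r in resistances]

total_series = sum(series_power)
for p, ps, pp in zip(ratings, series_power, parallel_power):
    print(f"{p:>4} W bulb: series {ps:5.1f} W "
          f"({100 * ps / total_series:4.1f}% of total), parallel {pp:5.1f} W")
```

Running this shows the 40W bulb taking roughly 44% of the series power while the 200W bulb gets about 9%, and in parallel each bulb simply delivers its rated wattage, so the 200W wins.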