If 40 W, 60 W, 100 W, and 200 W bulbs are connected in series, which bulb will glow the brightest?
That holds only if the lamps are designed to give their rated wattages at the same supply voltage. It all depends on the rated voltages.
The 40 W, if it glowed at all. Its resistance is the highest (R = V²/P), and since the same current flows through every bulb in series, P = I²R is largest for it: it would get about 44% of the available power. The 60 W would get about 29%, the 100 W about 18%, and the 200 W only about 9%.
Yes, the answer of 40 W seems accurate.
But the point is, when we connect them in series the current will be lower than the rated current of the bulbs.
And what if we connect all of them in parallel, keeping the voltage constant?

Then each bulb sees the full rated voltage, so the highest-wattage bulb draws the greatest current and is the brightest.
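The two cases can be checked numerically. This is a minimal sketch assuming all bulbs are rated for the same 230 V supply and treating each filament's resistance as constant (a real filament's resistance rises with temperature, so the actual split is less extreme):

```python
# Assumption: all bulbs rated at the same 230 V; constant filament resistance.
V = 230.0
ratings = [40, 60, 100, 200]  # rated wattages in W

# Resistance of each bulb from its rating: P = V^2 / R  =>  R = V^2 / P
R = {p: V**2 / p for p in ratings}

# Series: the same current flows through every bulb, I = V / sum(R)
I_series = V / sum(R.values())
P_series = {p: I_series**2 * R[p] for p in ratings}  # P = I^2 * R per bulb

# Parallel at the rated voltage: each bulb simply dissipates its rating
P_parallel = {p: V**2 / R[p] for p in ratings}

total = sum(P_series.values())
for p in ratings:
    print(f"{p:>3} W bulb: series power {P_series[p]:.1f} W "
          f"({100 * P_series[p] / total:.0f}% of the total)")
```

Running this shows the 40 W bulb taking the largest share in series (roughly 44%), while in parallel each bulb dissipates its rated power, so the 200 W bulb is brightest.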