# voltage divider problem

#### khatus

Joined Jul 2, 2018
85
Hello guys, I have some bulbs like this.

Each takes about 3.8 V at 0.3 A. Now I want to drive 21 of these bulbs in parallel, but I only have a 12 V voltage source, so I decided to use a voltage divider. When I calculate the resistor values required for the divider, I get the following result.

Each bulb draws 0.3 A, and the voltage across each bulb is 3.8 V.

But when I simulate it in Proteus, I see that increasing the load decreases the output voltage across the resistor R2: the output drops from 3.8 V to 3.76 V, but my requirement is 3.8 V. My question is: is there anything wrong with my equation or my solving method?
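The sag the simulator shows is inherent to a voltage divider driving a load: the divider behaves like its Thevenin equivalent, a 3.8 V source in series with a resistance, so any load current pulls the output down. A minimal sketch, using hypothetical resistor values (R1 = 8.2 Ω, R2 = 3.8 Ω, chosen only so the unloaded output is 3.8 V; the actual values in the question are not given):

```python
# Thevenin view of a loaded voltage divider (hypothetical values).
V_in = 12.0
R1, R2 = 8.2, 3.8                          # assumed divider resistors

V_unloaded = V_in * R2 / (R1 + R2)         # 3.8 V with nothing connected
R_th = R1 * R2 / (R1 + R2)                 # Thevenin source resistance, ~2.6 ohm

# 21 bulbs at 3.8 V / 0.3 A look like roughly 0.6 ohm in parallel
R_load = 3.8 / (21 * 0.3)

V_loaded = V_unloaded * R_load / (R_load + R_th)
print(V_unloaded, R_th, R_load, V_loaded)  # loaded output collapses far below 3.8 V
```

With these example values the loaded output falls well under 1 V. A stiffer (lower-resistance) divider sags less, which is why the simulation only dropped to 3.76 V, but some sag is unavoidable whenever the load current changes.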

#### SamR

Joined Mar 19, 2019
2,259
There is a reason why LEDs are so popular these days... Have you considered what power rating those resistors would need, or where you would source them?
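The wattage concern is easy to quantify. As a rough sketch (assuming the simplest case, a single series dropper resistor taking up the 8.2 V the parallel bulb bank doesn't need, which is not necessarily the OP's exact circuit):

```python
V_in, V_bulb, I_bulb, n = 12.0, 3.8, 0.3, 21

I_total = n * I_bulb          # 6.3 A drawn by 21 bulbs in parallel
V_drop = V_in - V_bulb        # 8.2 V left across the dropper resistor
R_drop = V_drop / I_total     # ~1.3 ohm
P_drop = V_drop * I_total     # ~51.7 W dissipated as heat

print(R_drop, P_drop)
```

A ~1.3 Ω resistor rated for 50+ W is a specialty power part, not something from the usual parts drawer.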

#### AlbertHall

Joined Jun 4, 2014
10,038
Three of those bulbs in series would need 11.4 V, and a diode or resistor in series with them would drop the remaining 0.6 V without wasting much power.
Repeat that seven times and you have all 21 bulbs.
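The arithmetic behind this arrangement is quickly checked; a sketch assuming a ~0.6 V silicon diode in series with each string:

```python
V_in, V_bulb, I_bulb = 12.0, 3.8, 0.3
bulbs_per_string = 3
strings = 7

V_string = bulbs_per_string * V_bulb      # 11.4 V across three series bulbs
V_diode = V_in - V_string                 # 0.6 V left for one silicon diode
total_bulbs = strings * bulbs_per_string  # 21 bulbs in total
I_total = strings * I_bulb                # 2.1 A from the 12 V supply
P_wasted = V_diode * I_total              # ~1.26 W lost in the diodes

print(total_bulbs, I_total, P_wasted)
```

About 1.26 W wasted in total, versus tens of watts burned in a divider feeding the same 21 bulbs in parallel.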

#### ElectricSpidey

Joined Dec 2, 2017
1,175
All the calculations under the sun won't let you drive a variable load with a voltage divider (not at a constant voltage, anyway).