Voltage divider problem

Thread Starter

khatus

Joined Jul 2, 2018
95
Hello guys I have some bulbs like this.



Each bulb takes about 3.8 V at 0.3 A. Now I want to drive 21 bulbs in parallel, but I only have a 12 V voltage source, so I decided to use a voltage divider. When I calculate the resistor values required for the voltage divider I get the following result.



Since each bulb draws 0.3 A and the voltage across each bulb is 3.8 V:






But when I simulate it in Proteus, I see that increasing the load decreases the output voltage across the resistor R[sub]2[/sub], i.e. the output voltage drops from 3.8 V to 3.76 V, but my requirement is 3.8 V. My question: is there anything wrong in my equation or solving method?
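For reference, here is a rough Python sketch of the loaded-divider math. The R1 and R2 values below are only assumed examples (not necessarily the ones from my calculation above), and the bulb is treated as a fixed 3.8 V / 0.3 A resistance; the point is just that the output depends on the load, so it sags as bulbs are added:

[code]
# Loaded voltage divider: Vout = Vin * (R2 || Rload) / (R1 + R2 || Rload)
# R1 and R2 are assumed example values; the bulb is modelled as a fixed
# resistance of 3.8 V / 0.3 A (its hot resistance).

VIN = 12.0          # supply voltage (V)
R1, R2 = 1.0, 2.0   # assumed divider resistors (ohms)
R_BULB = 3.8 / 0.3  # ~12.67 ohms per bulb

def parallel(a, b):
    return a * b / (a + b)

for n_bulbs in (20, 21, 22):
    r_load = R_BULB / n_bulbs          # n bulbs in parallel
    r_bottom = parallel(R2, r_load)    # R2 in parallel with the bulbs
    vout = VIN * r_bottom / (R1 + r_bottom)
    print(f"{n_bulbs} bulbs: Vout = {vout:.2f} V")
[/code]

With these values the output is 3.8 V at 21 bulbs but falls as the load grows, which matches what the Proteus simulation shows: the divider itself has no regulation.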

 

SamR

Joined Mar 19, 2019
5,031
There is a reason why LEDs are so popular these days... Have you considered what the wattage would be on those resistors, or where you would source them from?
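For a rough sense of scale, a quick Python sketch using the numbers from the first post (treating the top resistor as a simple series dropper carrying the full load current):

[code]
# Rough power estimate from the figures in the first post.
VIN, V_BULB, I_BULB, N = 12.0, 3.8, 0.3, 21

i_total = N * I_BULB                  # 6.3 A through the series (top) resistor
p_bulbs = V_BULB * i_total            # power actually delivered to the bulbs
p_drop = (VIN - V_BULB) * i_total     # power burned just dropping 12 V to 3.8 V

print(f"Load current:         {i_total:.1f} A")
print(f"Power in the bulbs:   {p_bulbs:.1f} W")
print(f"Power in the dropper: {p_drop:.1f} W (plus whatever R2 bleeds)")
[/code]

That is roughly 52 W wasted in the top resistor alone, before counting the bleed current through R2.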
 

AlbertHall

Joined Jun 4, 2014
12,345
Three of those bulbs in series would need 11.4 V, and a diode or resistor in series with them would drop the remaining 0.6 V without wasting much power.
Repeat that seven times and all 21 bulbs are covered.
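A quick Python sketch of the arithmetic, using the bulb figures from the first post:

[code]
# Seven strings of three bulbs in series, each string fed from 12 V.
VIN, V_BULB, I_BULB = 12.0, 3.8, 0.3
BULBS_PER_STRING, N_STRINGS = 3, 7    # 7 x 3 = 21 bulbs

v_string = BULBS_PER_STRING * V_BULB  # 11.4 V across the three bulbs
v_drop = VIN - v_string               # 0.6 V left over per string
r_series = v_drop / I_BULB            # series resistor per string
p_series = v_drop * I_BULB            # power wasted per string

print(f"Series resistor per string: {r_series:.1f} ohms, {p_series:.2f} W")
print(f"Total wasted power, 7 strings: {N_STRINGS * p_series:.2f} W")
[/code]

That works out to a 2 Ω resistor per string wasting 0.18 W, about 1.3 W wasted in total, versus roughly 50 W for the divider approach.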
 

ElectricSpidey

Joined Dec 2, 2017
2,758
All the calculations under the sun won't let you drive a variable load from a voltage divider (not at a constant voltage, anyway).

Use a regulator instead.
 

sparky 1

Joined Nov 3, 2018
756
Each bulb is about 12.67 Ω (3.8 V / 0.3 A) and dissipates 3.8 V × 0.3 A = 1.14 W.
There are many possible arrays, and each one behaves differently as a load on a voltage divider.
Most important: if you exceed the maximum current then poof! (the filament fails in a high-temperature event).
If you maintain a constant current, the supply voltage can change but the operating point will stay accurate and nothing will go poof...
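The numbers, sketched in Python (treating the filament as a fixed resistance, as above; a real filament is lower resistance when cold):

[code]
# Bulb figures at the rated operating point.
V_BULB, I_BULB = 3.8, 0.3
r_bulb = V_BULB / I_BULB      # 3.8 / 0.3 = ~12.67 ohms (hot)
p_bulb = V_BULB * I_BULB      # 1.14 W
print(f"R = {r_bulb:.2f} ohms, P = {p_bulb:.2f} W")

# Push the current above the rating and the dissipation climbs fast:
for i in (0.30, 0.35, 0.40):
    print(f"I = {i:.2f} A -> P = {i**2 * r_bulb:.2f} W")
[/code]

Hold the current at 0.3 A and the bulb sits at 1.14 W whatever the supply does; let the current climb and the dissipation rises with the square of it.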
 