Voltage divider Circuit

Thread Starter

jumbo

Joined Dec 4, 2007
2
I have a DC adapter with a 7.6V, 600mA output. From this I'm trying to get 3V at 600mA. I thought just putting a voltage divider circuit at the output of the adapter would do the trick. I made the divider circuit using 680 ohms and 1k ohms, taking the voltage across the 680 as my new output. Well, I get the desired 3V, but I do not get the current. I would still be ok if the current output dropped to 300mA. My load consumes about 200mA. Could anyone please help me with this? I really appreciate any help.
 

thingmaker3

Joined May 16, 2005
5,084
When you hook up your voltage divider to a load, the load becomes part of the voltage divider. This will alter your divider's output voltage.
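To see how badly the load skews things, here's a quick sketch using the values from the original post (1k on top, 680 on the bottom, output across the 680, and a ~200mA load treated as a roughly 15 ohm resistance):

```python
VIN = 7.6          # adapter voltage (V)
R1 = 1000.0        # top resistor (ohms)
R2 = 680.0         # bottom resistor (ohms), output taken across this

def parallel(a, b):
    """Equivalent resistance of two resistors in parallel."""
    return a * b / (a + b)

# Unloaded output: the plain divider ratio.
v_unloaded = VIN * R2 / (R1 + R2)

# A load drawing ~200 mA at 3 V looks like roughly a 15 ohm resistor,
# which sits in parallel with R2 and wrecks the divider ratio.
R_load = 3.0 / 0.2
r_bottom = parallel(R2, R_load)
v_loaded = VIN * r_bottom / (R1 + r_bottom)

print(f"unloaded: {v_unloaded:.2f} V")   # about 3.08 V
print(f"loaded:   {v_loaded:.2f} V")     # collapses to about 0.11 V
```

So the moment you connect the load, the "3V" output collapses to a fraction of a volt.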

I suggest you consider using a linear voltage regulator, such as the venerable LM317.
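For reference, here's the standard LM317 output formula from the datasheet; the resistor values below are just example picks (240 ohms is the datasheet's usual lower resistor, 330 ohms is an assumed value that lands near 3V), not something from this thread:

```python
# LM317: Vout = Vref * (1 + R2/R1), with Vref = 1.25 V
# (ignoring the tiny adjust-pin current).
VREF = 1.25                      # LM317 reference voltage (V)
R1 = 240.0                       # ohms, typical datasheet value
R2 = 330.0                       # ohms, example value for ~3 V out
v_out = VREF * (1 + R2 / R1)
print(f"{v_out:.2f} V")          # ≈ 2.97 V, close enough to 3 V
```

Unlike the divider, the regulator holds that voltage as the load current varies.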
 

Ron H

Joined Apr 14, 2005
7,014
The problem is that the 680 ohm resistor limits the current.
Using different values you could get 3V at 600mA.
Using a voltage divider calculator, the results are R1 = 7.66 ohms and R2 = 5 ohms.
But if you add a 200mA load to this, the voltage will drop to 2.4V. You will be wasting a lot of power trying to do it this way, and your output voltage will still vary as the load varies. Forget the voltage divider and use an LM317.
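A quick back-of-the-envelope check on the power point, using the divider values above:

```python
VIN = 7.6                    # adapter voltage (V)
R1, R2 = 7.66, 5.0           # divider values from the calculator (ohms)

i_bleed = VIN / (R1 + R2)    # current through the divider with NO load
p_wasted = VIN * i_bleed     # power burned in the resistors at idle

print(f"bleed current: {i_bleed * 1000:.0f} mA")  # ~600 mA just idling
print(f"wasted power:  {p_wasted:.1f} W")         # ~4.6 W of heat
```

The divider burns the adapter's full 600mA rating as heat even before the load draws anything, which is why the regulator is the better route.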
 

Ron H

Joined Apr 14, 2005
7,014
Or maybe the LM350
Btw, how did you calculate that adding a 200mA load will drop the voltage to 2.4V?
The Thevenin equivalent resistance is 7.66 ohms in parallel with 5 ohms, which equals 3 ohms. In other words, we have a 3 volt source with 3 ohms series resistance. 200mA through 3 ohms will drop 0.6V.
3V - 0.6V = 2.4V.
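The Thevenin reasoning above works out like this in a few lines:

```python
def parallel(a, b):
    """Equivalent resistance of two resistors in parallel."""
    return a * b / (a + b)

VIN, R1, R2 = 7.6, 7.66, 5.0       # volts and ohms, from the posts above

v_th = VIN * R2 / (R1 + R2)        # open-circuit (Thevenin) voltage ≈ 3.0 V
r_th = parallel(R1, R2)            # Thevenin resistance ≈ 3.0 ohms
v_out = v_th - 0.2 * r_th          # 200 mA load drops ~0.6 V across r_th

print(f"{v_out:.1f} V")            # ≈ 2.4 V, matching the hand calculation
```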
An LM350 will work, but it's overkill for a 200mA load.
 