Reducing voltage from a computer power supply

Thread Starter

Robert W. Boyle

Joined Feb 20, 2009
2
I have an old IBM computer power supply; works on AC or 12 V DC. Output is 16.78 VDC. I would like to reduce the output to 15 VDC, to run a newer Toshiba computer from 12 VDC in an RV. Can I do this with a resistor? If so, how do I calculate the value? I'm hoping to make a short cable with a socket for the old plug, a resistor, and the new plug. Is this a way to go?
Thanks - Doc
 

leftyretro

Joined Nov 25, 2008
395
The problem with using a resistor is that the desired voltage drop is only valid at one specific current draw, and your lappy will vary its current demand depending on drives running or not, CPU load changes, etc. The easiest solution for your case is to wire two series diodes between the positive output of the supply and the positive input to the laptop, both diodes wired in the same direction with the cathode end toward the laptop. By the way, you will have to check whether the current capacity of your IBM power supply is equal to or higher than what the Toshiba requires; it's not just about the voltage but also the current requirements, to ensure compatibility.
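To see why a resistor only works at one load current, here is a quick sketch of the arithmetic. The 3 A nominal draw is an assumption for illustration; check the Toshiba's label for its real rating.

```python
# Why a series resistor only regulates at one load current.
# ASSUMPTION: nominal laptop draw of 3 A (check your laptop's label).
V_SUPPLY = 16.78   # measured supply output, volts
V_TARGET = 15.0    # desired laptop input, volts
I_NOMINAL = 3.0    # assumed nominal laptop current, amps

# Ohm's law: size R so the drop is correct at the nominal current.
R = (V_SUPPLY - V_TARGET) / I_NOMINAL  # about 0.59 ohms

# But the laptop's draw varies, so the voltage it receives varies too:
for i_load in (1.0, 2.0, 3.0, 4.0):
    v_laptop = V_SUPPLY - R * i_load
    print(f"{i_load:.0f} A load -> {v_laptop:.2f} V at the laptop")
```

At 1 A the laptop would see about 16.2 V and at 4 A only about 14.4 V, which is exactly the regulation problem described above.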

That aside, I suspect your laptop would most likely run fine wired straight to the 16.78 volts, as that is only about a 12% increase and probably within normal tolerances for the laptop. But a couple of series diodes will drop the voltage by around 1.4 VDC. Be sure the diodes' current rating is also equal to or greater than the max current the laptop will draw.
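The diode approach can be sketched the same way. The 0.7 V per diode is the typical forward drop of a silicon rectifier; the exact figure varies a little with current and temperature, but unlike a resistor it stays roughly constant as the load changes.

```python
# Two series silicon diodes drop roughly 0.7 V each, largely
# independent of load current (unlike a resistor).
V_SUPPLY = 16.78   # measured supply output, volts
V_DIODE = 0.7      # typical silicon forward drop per diode, volts
N_DIODES = 2

v_laptop = V_SUPPLY - N_DIODES * V_DIODE
print(f"laptop sees about {v_laptop:.2f} V")  # close to the 15 V target

# ASSUMPTION: 3 A load, for sizing the diodes' dissipation.
i_load = 3.0
p_per_diode = V_DIODE * i_load
print(f"each diode dissipates about {p_per_diode:.1f} W")
```

The dissipation figure is why the diodes' current rating matters: at a few amps each diode turns a couple of watts into heat, so pick parts rated at or above the laptop's maximum draw.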

Lefty
 