Reducing voltage from a computer power supply

Discussion in 'The Projects Forum' started by Robert W. Boyle, Feb 20, 2009.

  1. Robert W. Boyle

    Thread Starter New Member

    Feb 20, 2009
    2
    0
    I have an old IBM computer power supply; it works on AC or 12 VDC. Output is 16.78 VDC. I would like to reduce the output to 15 VDC, to run a newer Toshiba computer from 12 VDC in an RV. Can I do this with a resistor? If so, how do I calculate the value? I'm hoping to make a short cable with a socket for the old plug - resistor - new plug. Is this a way to go?
    Thanks - Doc
     
  2. leftyretro

    Active Member

    Nov 25, 2008
    394
    2
    The problem with using a resistor is that the desired voltage drop is only valid at one specific current draw, and your lappy will vary its current demand depending on drives running or not, CPU load changes, etc. The easiest solution in your case is to wire two series diodes between the positive output of the supply and the positive input to the laptop, both diodes wired in the same direction with the cathode end toward the laptop. By the way, you will have to check that the current capacity of your IBM power supply is equal to or higher than what the Toshiba requires; it's not just about the voltage but also the current requirements, to ensure compatibility.
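
    To put some numbers on that, here's a rough sketch (the 3 A draw is just an assumed figure - check the Toshiba's label for its real rating):

        # Why a series resistor only gives the right drop at one current.
        # The 3 A design current is an assumption; check the laptop's label.
        supply_v = 16.78   # measured supply output (V)
        target_v = 15.0    # desired laptop input (V)
        design_i = 3.0     # assumed laptop draw (A)

        r = (supply_v - target_v) / design_i   # Ohm's law: R = V / I, ~0.59 ohm

        for load_i in (1.0, 2.0, 3.0, 4.0):
            laptop_v = supply_v - r * load_i   # voltage actually reaching the laptop
            print(f"{load_i:.1f} A load -> {laptop_v:.2f} V at the laptop")

    The resistor is only "right" at the 3 A design point: at a 1 A load the laptop sees about 16.2 V, and at 4 A it sags to about 14.4 V.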

    That aside, I suspect your laptop would most likely run fine wired straight to the 16.78 volts, as that is only about a 12% increase over 15 V and probably within normal tolerance for the laptop. But a couple of series diodes will drop the voltage by around 1.4 VDC. Be sure the diodes' current ratings are also equal to or greater than the max current the laptop will draw.
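
    To sanity-check the diode numbers (assuming ordinary silicon diodes at roughly 0.7 V forward drop each - the exact drop shifts a little with current and diode type):

        # Two series silicon diodes drop a roughly constant ~0.7 V each,
        # largely independent of load current (unlike a resistor).
        supply_v = 16.78
        diode_drop = 0.7                       # typical silicon forward drop (V)
        laptop_v = supply_v - 2 * diode_drop
        print(f"Laptop sees about {laptop_v:.2f} V")   # ~15.38 V, near the 15 V target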

    Lefty
     
  3. mik3

    Senior Member

    Feb 4, 2008
    4,846
    63
  4. Robert W. Boyle

    Thread Starter New Member

    Feb 20, 2009
    2
    0
    Thanks to Lefty and Mik3 - issue resolved for me. Prompt help greatly appreciated. Great forum!!
    Doc
     