# Power supply Help

Discussion in 'The Projects Forum' started by StephenMG, Feb 7, 2008.

1. ### StephenMG Thread Starter Member

Hi,

I hope someone can help. I have a power supply with the following output specs.

Voltage: 30 VDC, Current: 1560 milliamps

I want to limit the current to a maximum of 1.5 milliamps. How do I do this?

I think if I solder a resistor to one side of the power cord it will drop the current but leave the voltage the same, which is what I want. But I don't know what size resistor to use to do this.

Thanks,
StephenMG

2. ### beenthere Retired Moderator

For a current limiting resistor, use Ohm's law: R = E/I. That comes out to a nice round 20,000 ohms for 30 V / 0.0015 A.
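That figure is easy to sanity-check; here's the arithmetic as a quick Python sketch (the language choice is just for illustration):

```python
# Ohm's law: R = E / I
E = 30.0    # supply voltage in volts
I = 0.0015  # desired maximum current in amps (1.5 mA)
R = E / I
print(f"R = {R:.0f} ohms")  # -> R = 20000 ohms
```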

3. ### StephenMG Thread Starter Member

OK, so if I use a 20k ohm resistor it will drop the current to 1.5 milliamps but leave the voltage at 30 VDC, right? Also, what wattage resistor should I get? I don't want to overheat the resistor and burn it out.

Yes, John, I would solder the resistor to one side of the DC output cord.

Thank you again everyone

4. ### SgtWookie Expert

If the impedance of the power supply is near zero, and you connect a 20K Ohm resistor across the output terminals, you should still read 30V across the resistor, and 1.5mA will be flowing through the resistor.

If you measure less than 30V at the output after connecting the resistor, I'd be very surprised.

You figure the power in watts from the voltage and current:
P = E x I
P = 30 V x 0.0015 A
P = 0.045 watts
That's roughly 1/22 of a watt.
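At 0.045 W the dissipation is tiny, so even a common 1/4 W part would have better than 5x margin (the 1/4 W rating is just an assumption about readily available resistors). The same result three equivalent ways, sketched in Python:

```python
# Power in the resistor, computed three equivalent ways
E = 30.0      # volts across the resistor
I = 0.0015    # amps through it
R = 20_000.0  # ohms

print(f"P = E*I   = {E * I:.3f} W")      # -> 0.045 W
print(f"P = I^2*R = {I * I * R:.3f} W")  # same result
print(f"P = E^2/R = {E * E / R:.3f} W")  # same result
```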

5. ### Ron H AAC Fanatic!

It will leave the voltage at 30V until you put a load on it. If you want a supply that will put out a constant 30V for any current less than 1.5mA, it gets way more difficult. Now, if you wanted, say, 25V at a maximum current of 1.5mA, that would be fairly easy.
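The series resistor and the load form a voltage divider, which is why the output sags under load. A rough numeric illustration in Python (the load resistances below are made-up values, not anything from the actual device):

```python
# Voltage divider: the output voltage drops as soon as the load draws current.
V_supply = 30.0      # supply voltage in volts
R_series = 20_000.0  # series "limiting" resistor in ohms

for R_load in (1_000_000.0, 100_000.0, 20_000.0, 1_000.0):  # hypothetical loads
    V_load = V_supply * R_load / (R_load + R_series)
    I_mA = 1000.0 * V_load / R_load
    print(f"R_load = {R_load:>9.0f} ohms -> V_load = {V_load:5.2f} V, I = {I_mA:.3f} mA")
```

With a 20k load (equal to the series resistor) the device sees only 15 V, which is why a plain series resistor can't deliver both a constant 30 V and a hard current limit.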

6. ### Ron H AAC Fanatic!

The 20k resistor will indeed draw 1.5mA, but it won't limit the current through any other device you connect across the 30V supply.
What are you actually trying to do?

7. ### StephenMG Thread Starter Member

Hi Ron,

My friend built a device and now needs a power supply for it. He needs the power supply to provide a constant 32 VDC at a maximum current of 1.5 milliamps.

So, I have a 32 VDC power supply that puts out 1560 milliamps and I need to drop the maximum current to 1.5 milliamps.

I figured that if I solder one 20k ohm, one 300 ohm, and one 33 ohm resistor in series into one side of the output cord, it would come very close to a maximum current draw of 1.5 milliamps.

Is this correct, or am I all messed up?
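For what it's worth, a quick check of that series string (Python, just for the arithmetic; this assumes the three resistors are wired in series on one leg of the cord):

```python
# Series resistances simply add
R_total = 20_000 + 300 + 33         # = 20333 ohms
I_max = 32.0 / R_total              # worst case: output shorted
print(f"I_max = {1000 * I_max:.3f} mA")  # -> about 1.574 mA

# For exactly 1.5 mA at 32 V the string would need to total:
print(f"R needed = {32.0 / 0.0015:.0f} ohms")  # -> 21333 ohms
```

So the proposed string comes in about 5% high; swapping the 300 ohm for a 1.3k (20,000 + 1,300 + 33 = 21,333 ohms) would hit 1.5 mA exactly, though only for a shorted output.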

Thank you guys for helping me so much,
StephenMG

8. ### Ron H AAC Fanatic!

The device will only draw as much current as it needs. The current rating on a power supply is the maximum current it can supply.

9. ### StephenMG Thread Starter Member

OK, that I understand, but under a certain condition this device will begin drawing more current, and he only wants it to draw a maximum of 1.5 milliamps.

10. ### Ron H AAC Fanatic!

Can he live with less than 32V? How about 30V?

EDIT: Here's a circuit that puts out (according to simulation) about 31.9 V with no load, and 31.3 V at 1.5 mA. The voltage starts to drop like a rock at 1.593 mA, and is zero at 1.595 mA. The current limit depends primarily on the value of R1, and to a lesser degree on R2.

[Two attachments: circuit schematic and simulation results]