# Why can I use a power supply with a higher current output but the correct voltage rating

#### u0362565

Joined Dec 10, 2019
11
Hi all,

I've read that you can use a power supply with a device if it has the same voltage rating but a higher current output than required. Why is this ok? And does it depend on the device? I.e. are some devices clever and able to reduce the current by increasing resistance?

Thanks for the help

#### dl324

Joined Mar 30, 2015
12,255
Welcome to AAC!
I.e. are some devices clever and able to reduce the current by increasing resistance?
There's nothing clever about the devices. They need a certain amount of current at a specified voltage. The current may vary according to what the device is doing, but it stays within the range the device was designed for. To prevent a device from exceeding a maximum current, fuses are used.

Consider this: if you connect two appliances that run on house current (assuming that, combined, they won't trip the breaker), then when they're on, each takes the current it needs.

#### LesJones

Joined Jan 8, 2017
2,839
For a fixed voltage the current is controlled by the resistance of the device (Ohm's law). That is a simplified explanation for resistive loads such as heaters, but it also applies to more complex loads such as motors, incandescent lamps, etc. This only applies to constant-voltage power supplies (which most power supplies are). The exception is constant-current power supplies used for driving LEDs. These give a constant current output with loads within the designed output voltage range.
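The contrast between the two supply types can be sketched with a toy model (idealized, not any real PSU's behavior; the function names are made up for illustration):

```python
def constant_voltage_supply(v_out, r_load):
    """Ideal constant-voltage supply: holds v_out, the load sets the current."""
    i = v_out / r_load          # Ohm's law: the load's resistance decides
    return v_out, i

def constant_current_supply(i_out, r_load):
    """Ideal constant-current LED driver: holds i_out, the load sets the voltage."""
    v = i_out * r_load          # voltage adjusts to keep the current fixed
    return v, i_out

# 50-ohm load on a 5 V constant-voltage supply: 100 mA flows.
print(constant_voltage_supply(5.0, 50.0))
# 100-ohm load on a 20 mA constant-current driver: 2 V appears across it.
print(constant_current_supply(0.02, 100.0))
```

In the first case the voltage is the fixed quantity and the current falls out of Ohm's law; in the second, the roles are swapped.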

Les.

#### OBW0549

Joined Mar 2, 2015
3,566
I've read that you can use a power supply with a device if it has the same voltage rating but a higher current output than required. Why is this ok?
The power supply doesn't have a "higher current output"; it has an output which is CAPABLE of delivering a higher current (before it goes into shutdown, or current-limiting mode, or simply overheats and catches fire). The power supply's current rating is just that: a rating. If it's rated at 1 amp, that means it's capable of putting out up to 1 amp, and no more. Power supplies don't force current to flow through a load; they maintain a specific output voltage and allow a load to draw current from them at that voltage.
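One way to picture "the rating is a ceiling, not a push" is a crude sketch like the following (an assumption-laden model: a real supply might shut down or overheat rather than cleanly clamp at its rating):

```python
def current_drawn(v_set, r_load, i_rating):
    """Current a resistive load pulls from an idealized regulated supply.

    The supply maintains v_set, so the load demands v_set / r_load.
    The rating only matters if demand exceeds it; here that is modeled,
    crudely, as the supply clamping at its rated current.
    """
    demand = v_set / r_load
    return min(demand, i_rating)

# Same 50-ohm load on a 5 V, 1 A supply: it draws 100 mA, nowhere near 1 A.
print(current_drawn(5.0, 50.0, 1.0))   # 0.1
# A too-heavy 2-ohm load would demand 2.5 A; the supply limits at 1 A.
print(current_drawn(5.0, 2.0, 1.0))    # 1.0
```

For any load within the supply's capability, the rating never enters the result at all.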

#### u0362565

Joined Dec 10, 2019
11
Thanks all. I like the idea that a PSU is capable of delivering a maximum current but the device doesn't necessarily draw that amount. However, I assume a device has a fixed resistance, so if I swap a 5 V 0.5 A PSU for a 5 V 2 A one, with the resistance and voltage constant, why doesn't the device draw the maximum current? Compare it to water: if I have a fixed pressure (the 5 V) and a pipe of set cross-section (the fixed resistance), isn't the flow rate (the current) pre-determined, so that I get exactly that rate from the tap, nothing less? So I still don't understand why the current through the device isn't at the level the PSU can output. Isn't the charge forced through, in a sense, as determined by the voltage and resistance? The device is passive; it's not a pump that sucks at the rate it wants, or is it? Sorry, I've really dumbed it down now.

#### OBW0549

Joined Mar 2, 2015
3,566
However, I assume a device has a fixed resistance, so if I swap a 5 V 0.5 A PSU for a 5 V 2 A one, with the resistance and voltage constant, why doesn't the device draw the maximum current?
The current drawn by a resistive load, in amps, is equal to the voltage applied to it by the power source, in volts, divided by the load resistance, in ohms.

Ohm's Law.

#### dendad

Joined Feb 20, 2016
3,657
Go back to the water idea.
If you have a tank with a 1-meter head of water, but only 1 meter in diameter, and add a small hole of 1 mm, you will get a certain flow of water.
Now go to another tank that has a 1-meter head but is 20 meters in diameter. A 1 mm hole drilled in this tank will give you the same flow as the 1-meter-diameter tank, but it will be able to supply that flow for a greatly increased time.
The current capacity of the power supply is what it is capable of supplying if asked. The current is dependent on what the load "asks" for, not what the power supply can deliver.
Your car battery can run an LED at a few milliamps, and also supply many amps for the starter. Same battery, same voltage, different current.
But a 12V doorbell battery, same voltage, can run the LED but not the starter motor. It does not have the current capacity.

You stated: "However, I assume a device has a fixed resistance, so if I swap a 5 V 0.5 A PSU for a 5 V 2 A one, with the resistance and voltage constant, why doesn't the device draw the maximum current?"
Voltage constant = 5 V
Resistance constant = 50 ohms (for example)

Current = voltage / resistance (I = E/R). This shows my age: voltage was represented by E, not V, when I learnt this.

5 / 50 = 0.1
Current = 0.1 A = 100 mA.

Why would this change when you go to a larger-capacity supply? What in the math changes?
The voltage remains the same, as does the resistance, and the current depends only on those two. Nothing in the math changes.
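The worked example above can be run as a quick sketch, using the 0.5 A and 2 A supplies from the original question:

```python
V = 5.0    # supply voltage, volts
R = 50.0   # load resistance, ohms

for rating in (0.5, 2.0):       # the two supplies from the question
    i = V / R                   # Ohm's law: the rating never appears
    print(f"{rating} A supply: load draws {i * 1000:.0f} mA")
# prints "100 mA" for both ratings
```

The supply's rating simply isn't a variable in the equation, so swapping supplies of the same voltage changes nothing for the load.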

Last edited:

#### u0362565

Joined Dec 10, 2019
11
Ok, got it now. Current should be considered from the device's perspective, not in terms of what the power supply can provide. That's a nice analogy, dendad: the head is the voltage, the hole size the resistance, and the flow rate the current. So, looking at this from the point of view of the supply voltage, can we explain why we shouldn't use too high a supply voltage?

#### dendad

Joined Feb 20, 2016
3,657
Ok, got it now. Current should be considered from the device's perspective, not in terms of what the power supply can provide. That's a nice analogy, dendad: the head is the voltage, the hole size the resistance, and the flow rate the current. So, looking at this from the point of view of the supply voltage, can we explain why we shouldn't use too high a supply voltage?
Yes indeed.
You just might burst the pipes: too high a voltage pushes more current through the same resistance than the device was built to handle.