Noob Power Supply Question

Thread Starter

xbone

Joined Nov 26, 2018
3
Hello,
This is a very basic question, so pardon my naivete. It has to do with real-life adjustable DC power supplies. In most texts, students are introduced to ideal voltage and current sources, and if either one is connected to a resistive load, Ohm's law will determine the V-I relationship. Now most of the adjustable DC power supplies I've seen (but never used) seem to double as voltage and current sources, as in the photo attached (that's my assumption). My question is, what happens if a resistor is connected across the output terminal of this power supply and the voltage and current values are set in such a way that they do not conform with Ohm's law? E.g. if the resistor at the output terminal is 100kΩ and the voltage is set to 10V, by Ohm's law, the current flowing through the resistor should be 0.1mA, but what happens if the power supply current is set to output 1mA instead?


PowerSupply.jpg
 

KeithWalker

Joined Jul 10, 2017
3,063
When you use a power supply with adjustable voltage and current, you first set the required voltage with no load connected and then, with the output shorted, you set the current limit.
With any load connected which draws less current than the set limit, the voltage across the load will be the set voltage. If a load is connected that would draw more than the set current at the set output voltage, the output voltage will drop. The values of voltage and current for any load can be calculated using Ohm's law.
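To put rough numbers on that, here is a minimal Python sketch of how a supply like this settles on an operating point with a resistive load (this treats both knobs purely as limits and the load as an ideal resistor, which is a simplification):

```python
# Minimal model of a CV/CC bench supply driving a resistor.
# v_set and i_set are the front-panel settings (both act as limits), r is the load.
def operating_point(v_set, i_set, r):
    i_at_full_voltage = v_set / r        # current the load would draw at the set voltage
    if i_at_full_voltage <= i_set:
        # Voltage limit is reached first: constant-voltage (CV) mode
        return ("CV", v_set, i_at_full_voltage)
    # Current limit is reached first: constant-current (CC) mode, voltage drops to I*R
    return ("CC", i_set * r, i_set)

print(operating_point(10.0, 0.5, 100.0))  # ('CV', 10.0, 0.1) -> light load, set voltage holds
print(operating_point(10.0, 0.5, 5.0))    # ('CC', 2.5, 0.5)  -> heavy load, voltage drops
```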
 

dl324

Joined Mar 30, 2015
16,839
Welcome to AAC!
My question is, what happens if a resistor is connected across the output terminal of this power supply and the voltage and current values are set in such a way that they do not conform with Ohm's law?

E.g. if the resistor at the output terminal is 100kΩ and the voltage is set to 10V, by Ohm's law, the current flowing through the resistor should be 0.1mA, but what happens if the power supply current is set to output 1mA instead?
Assuming you haven't constrained the power supply output voltage, it will be around 30V because it isn't capable of providing the 100V that 1mA through a 100k resistor would require.

I say around 30V instead of exactly 30V because most supplies will go a bit higher than the stated voltage, particularly at such a low current.
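A quick sketch of that arithmetic (the 30V ceiling is an assumption based on the supply in the photo):

```python
# xbone's example, assuming the supply tops out near 30 V.
r = 100e3        # 100 kΩ load
i_set = 1e-3     # 1 mA current setting
v_max = 30.0     # assumed maximum output voltage

v_needed = i_set * r          # 100 V would be needed to force 1 mA
v_out = min(v_needed, v_max)  # the supply can only reach its ceiling: 30 V
i_out = v_out / r             # so roughly 0.3 mA actually flows
print(v_needed, v_out, i_out) # 100.0 30.0 0.0003
```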
 

Thread Starter

xbone

Joined Nov 26, 2018
3
When you use a power supply with adjustable voltage and current, you first set the required voltage with no load connected and then, with the output shorted, you set the current limit.
With any load connected which draws less current than the set limit, the voltage across the load will be the set voltage. If a load is connected that would draw more than the set current at the set output voltage, the output voltage will drop. The values of voltage and current for any load can be calculated using Ohm's law.
Thanks for the quick response. Are you saying that the value of the current displayed on the power supply is not the output but the max limit?
 

MrChips

Joined Oct 2, 2009
30,704
Thanks for the quick response. Are you saying that the value of the current displayed on the power supply is not the output but the max limit?
Exactly.

What you see on the display settings are maximum limits. This is true for both current and voltage settings.
Ohm's Law still applies.
At any time, one of the two limits will be reached, whichever comes first. If the voltage limit is reached, the PSU is in constant voltage mode. If the current limit is reached, the PSU is in constant current mode. You never have both limits triggered at the same time, because that would violate Ohm's Law.

I = V / R

I and V are directly related.
 

AnalogKid

Joined Aug 1, 2013
10,986
Thanks for the quick response. Are you saying that the value of the current displayed on the power supply is not the output but the max limit?
Only when you have intentionally shorted the output. The vast majority of the time, the current displayed is the output current into the external load/circuit/whatever.

Under normal conditions, both displays indicate the operating condition of the supply. If you increase the load such that the supply goes into current limiting, you will see the current display stop increasing and the voltage display start decreasing. These reflect the conditions at the supply output terminals.

ak
 

MisterBill2

Joined Jan 23, 2018
18,167
A combination regulated power supply will regulate either the output voltage OR the output current, not both at once. So either the supply will hold a constant voltage with any resistor connected, up to the supply's current limit, OR it will deliver a set current into a load, up to the maximum voltage it can provide, based on the setting of the power supply voltage control.
 

MrSalts

Joined Apr 2, 2020
2,767
OR it will deliver a set current into a load, up to the maximum voltage it can provide.
Voltage will rise to the maximum set voltage; it will not go to the maximum rated voltage of the power supply (unless the user sets the voltage to the maximum).
 

MrSalts

Joined Apr 2, 2020
2,767
Thanks for the quick response. Are you saying that the value of the current displayed on the power supply is not the output but the max limit?
Both voltage and current settings are maximum values. For an infinite resistance (no load), the current will be zero and the voltage will be at the set point. For a short circuit, the voltage will fall as far as needed to keep the current from exceeding the max limit.
 

MisterBill2

Joined Jan 23, 2018
18,167
On the power supplies that I have used, the display indicates actual values of voltage output or current output. Displaying the supply maximum capabilities is not normally useful.
Showing the actual voltage and current is normally what is required.
 

Thread Starter

xbone

Joined Nov 26, 2018
3
Thanks, everyone, and sorry for the late response. So let's say I have a regulated power supply like the one shown in the picture and connect only a 100kΩ resistor across the terminals. Now, what if I want 0.1mA flowing through it? How should I set the voltage and current on the power supply? Ohm's law says the voltage across the resistor should be 10V. Should the settings on the power supply be something like this: current = 0.1mA, voltage = 10V or more (assuming this power supply is capable of producing higher output)?
And if I require 1V across the resistor, should I set the power supply voltage to 1V and the current to at least 0.01mA?
 

MisterBill2

Joined Jan 23, 2018
18,167
Thanks, everyone, and sorry for the late response. So let's say I have a regulated power supply like the one shown in the picture and connect only a 100kΩ resistor across the terminals. Now, what if I want 0.1mA flowing through it? How should I set the voltage and current on the power supply? Ohm's law says the voltage across the resistor should be 10V. Should the settings on the power supply be something like this: current = 0.1mA, voltage = 10V or more (assuming this power supply is capable of producing higher output)?
And if I require 1V across the resistor, should I set the power supply voltage to 1V and the current to at least 0.01mA?
Because you know the resistance, and because it is much easier to accurately set the voltage, you would set the voltage to 10.00 volts, and set the current limit to some value above 10 mA, because setting a current limit accurately at such a low value will be a challenge on most power supplies. Also, it is simple to set the voltage accurately prior to connecting the load, while setting the current limit on an average power supply requires being able to read that current passing through something, which is more complex.
This probably does not apply to those power supplies that cost over $6,000.
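As a quick numbers check on that approach (the 50 mA limit below is just an example of a "comfortably high" setting):

```python
# Set 10.00 V with a generous current limit; the 100 kΩ load keeps the supply in CV mode.
r = 100e3         # ohms
v_set = 10.0      # volts
i_limit = 50e-3   # amps -- any value comfortably above the expected 0.1 mA

i_load = v_set / r         # 0.0001 A = 0.1 mA
assert i_load < i_limit    # limit never reached, so the output stays at 10.00 V
print(i_load * 1e3, "mA")  # 0.1 mA
```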
 

dl324

Joined Mar 30, 2015
16,839
connect only a 100kΩ resistor across the terminals. Now, what if I want 0.1mA flowing through it? How should I set the voltage and current on the power supply? Ohm's law says the voltage across the resistor should be 10V. Should the settings on the power supply be something like this: current = 0.1mA, voltage = 10V or more
If you want 10V, you set the power supply to 10V. Your supply doesn't have the resolution to indicate a current of 0.1mA.

If you want a constant voltage, you just set the voltage to what you want and then, optionally, limit the current. If you're not going to limit current, which is fine, just make sure it's set higher than any current you expect to require.

If you want a constant current, you set the supply to its maximum voltage setting, short the output, and set the maximum current. If you want a maximum voltage lower than what the supply is capable of, set it to that value before you set the current limit.
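For the constant-current case, a small sketch of what that setup gives you (the 30V compliance ceiling is again just an assumed value):

```python
# Supply used as a constant-current source: current setting fixed at 5 mA,
# voltage setting (assumed 30 V here) acts as the compliance ceiling.
i_set, v_ceiling = 5e-3, 30.0
for r in (100, 1e3, 5e3, 10e3):
    v_needed = i_set * r
    if v_needed <= v_ceiling:
        print(r, "ohms ->", v_needed, "V at the set 5 mA")        # stays in CC mode
    else:
        print(r, "ohms -> ceiling reached, only", v_ceiling / r, "A flows")
```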
 

KeithWalker

Joined Jul 10, 2017
3,063
You still appear to be having a problem understanding what the settings on the supply are for. They set the MAXIMUM voltage and current that the supply will provide.
If the voltage is set, it will provide that voltage for any load which draws less than the current set on the supply.
If the current is set, it will provide that current for any load that requires less than the voltage set on the supply.
The supply is usually used as a constant voltage source, with the current limit set to a safe level.
The supply can be used as a constant current source with the voltage limit set to a safe level.
I hope this makes the use of an adjustable power supply a little clearer.
 