What's a good voltage tolerance for a small power supply?

Thread Starter

geratheg

Joined Jul 11, 2014
107
Hey guys!

I'm gearing up to start learning electronics and bought a Dynex 1200 mA variable AC-to-DC power supply, then measured its output voltages with a multimeter.

On the 3V, 4.5V, 6V, 7.5V, and 12V settings the reading was about 0.28V higher than marked, so 3V read 3.28V and so on.
The 9V setting was about 0.36V high, reading 9.36V.

Are these considered big or small inaccuracies?
Should I worry about them?

What's a good tolerance when it comes to voltages?
Are these values within a good tolerance?

Thank you!
 

crutschow

Joined Mar 14, 2008
34,280
How accurate is your multimeter?

Typically you would want the voltages to be within a couple percent of the setting.
 

Thread Starter

geratheg

Joined Jul 11, 2014
107
I don't know, is there any way to check?

I thought multimeters were pretty accurate?

Edit: I'm not sure if this is an accurate test, but I measured several brand new batteries and got these readings:
Duracell AAA 1.61V
Kirkland AA 1.57V
Energizer C4 1.59V
Duracell 9V 9.58V
Also measured an 18.5 V laptop charger with a reading of 19.54 V
 

studiot

Joined Nov 9, 2007
4,998
What you haven't said is what the conditions of measurement were.

You need to make measurements under load, as well as open circuit.

Commercial battery and power supply testers put a specific suitable load on each battery during testing.

The quality of a supply is determined by its response to increasing load (decreasing load resistance).

The term % regulation is a useful parameter, being the (open circuit voltage - full load voltage) divided by the open circuit voltage, expressed as a %.
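
As a quick worked example (the 8.90 V full-load figure below is made up purely for illustration, not a measured value):

```python
# % regulation = (open-circuit voltage - full-load voltage) / open-circuit voltage
v_open = 9.36   # measured open-circuit voltage on the 9 V setting
v_full = 8.90   # hypothetical full-load voltage, assumed for illustration

regulation_pct = (v_open - v_full) / v_open * 100
print(f"% regulation = {regulation_pct:.1f}%")  # prints: % regulation = 4.9%
```

The smaller that number, the stiffer the supply.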
 

MrChips

Joined Oct 2, 2009
30,706
You need to double check your voltmeter against a verifiable reference.

One rough test would be to assemble a circuit using a voltage regulator such as an LM7805A.
The output should be within 2% of the specified voltage.
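
As a sanity check on what "within 2%" means here, the acceptance band around the nominal 5 V is easy to compute (a minimal sketch):

```python
# 2% acceptance band around the LM7805A's nominal 5 V output
nominal = 5.0      # volts
tolerance = 0.02   # 2%

low = nominal * (1 - tolerance)
high = nominal * (1 + tolerance)
print(f"Meter should read between {low:.2f} V and {high:.2f} V")  # 4.90 V to 5.10 V
```

If your meter reads well outside that window on a known-good regulator, suspect the meter.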
 

crutschow

Joined Mar 14, 2008
34,280
If you are reading a more-or-less consistent bias voltage increase for all the voltages you measure on the power supply, then I suspect you are reading the voltages correctly.

I assume this is an inexpensive wall-wart type supply, and you can't expect those to have high accuracy. Those voltages are likely typical for such supplies. They probably set the voltages somewhat high to compensate for voltage drops when the supply is loaded. Try measuring the voltages with a small load, say 100 mA.
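
To size that test load, plain Ohm's law gives the resistor value and its dissipation at each setting (the numbers below are just arithmetic, not a parts recommendation):

```python
# Load resistor for a ~100 mA draw at each setting: R = V / I, P = V * I
settings = [3.0, 4.5, 6.0, 7.5, 9.0, 12.0]  # volts
i_load = 0.100                               # amps (100 mA)

for v in settings:
    r = v / i_load   # ohms
    p = v * i_load   # watts dissipated in the resistor
    print(f"{v:4.1f} V: R = {r:5.0f} ohm, P = {p:.2f} W")
```

Note the 12 V setting dissipates over a watt, so use a resistor with an adequate power rating.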
 

Thread Starter

geratheg

Joined Jul 11, 2014
107
I measured a USB output with a $10 multimeter and it showed 5.05 V, within 1% of nominal.

I guess the power supply runs slightly high to compensate for losses under load.
 

THE_RB

Joined Feb 11, 2008
5,438
Those voltages are very typical.

I've opened up a lot of those little switch-selectable DC PSUs; they usually use an LM317 voltage regulator, a handful of resistors, and the selector switch. It's typical for the voltages to read a fraction high under no-load conditions.
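
The LM317's output follows its standard formula, which also hints at why these supplies sit a touch high (the resistor values below are hypothetical, just to show the arithmetic):

```python
# LM317 output: Vout = 1.25 * (1 + R2/R1) + Iadj * R2
# The adjust-pin current (about 50 uA) adds a small positive offset.
def lm317_vout(r1_ohms: float, r2_ohms: float, i_adj: float = 50e-6) -> float:
    return 1.25 * (1 + r2_ohms / r1_ohms) + i_adj * r2_ohms

# Hypothetical divider aimed near the 9 V setting:
print(f"{lm317_vout(240, 1500):.2f} V")  # prints: 9.14 V
```

Standard E-series resistor values rarely land exactly on the marked voltage either, which adds to the offset.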
 

MrCarlos

Joined Jan 2, 2010
400
Hi there

From another point of view:

You have:

Setting   Reading    Error
3 V       3.28 V     +9.3%
4.5 V     4.78 V     +6.2%
6 V       6.28 V     +4.7%
7.5 V     7.78 V     +3.7%
9 V       9.36 V     +4.0%
12 V      12.28 V    +2.3%

If your power supply is as you mention, it is a cheap wall supply.
The errors are within a tolerable range, around the typical 5% mark.

Assuming your DMM is pretty good.
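
Those percentages are just (reading − setting) / setting; a few lines will reproduce the table:

```python
# Recompute the error table: error % = (reading - setting) / setting * 100
measurements = {3.0: 3.28, 4.5: 4.78, 6.0: 6.28, 7.5: 7.78, 9.0: 9.36, 12.0: 12.28}

for setting, reading in measurements.items():
    error_pct = (reading - setting) / setting * 100
    print(f"{setting:4.1f} V set, {reading:5.2f} V read: {error_pct:+.1f}%")
```

Notice the fixed ~0.28 V offset looks worse, in percentage terms, at the lower settings.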
 