What's a good voltage tolerance for a small power supply?

Discussion in 'General Electronics Chat' started by geratheg, Jul 16, 2014.

  1. geratheg

    Thread Starter Member

    Jul 11, 2014
    Hey guys!

    I'm gearing up to start learning electronics. I bought a Dynex 1200 mA variable AC-to-DC power supply and measured its output voltages with a multimeter.

    At each of the 3V, 4.5V, 6V, 7.5V, and 12V settings the reading was about 0.28V higher than labeled, so 3V read 3.28V, and so on.
    The exception was the 9V setting, which read about 0.36V high, at 9.36V.

    Are these considered big inaccuracies or small inaccuracies?
    Should I worry about these inaccuracies?

    What's a good tolerance when it comes to voltages?
    Are these values within a good tolerance?

    Thank you!
  2. crutschow


    Mar 14, 2008
    How accurate is your multimeter?

    Typically you would want the voltages to be within a couple of percent of the setting.
  3. geratheg

    Thread Starter Member

    Jul 11, 2014
    I don't know, is there any way to check?

    I thought multimeters were pretty accurate?

    Edit: I'm not sure if this is an accurate test, but I measured several brand new batteries and got these readings:
    Duracell AAA 1.61V
    Kirkland AA 1.57V
    Energizer C4 1.59V
    Duracell 9V 9.58V
    Also measured an 18.5 V laptop charger with a reading of 19.54 V
    Last edited: Jul 16, 2014
  4. studiot

    AAC Fanatic!

    Nov 9, 2007
    What you haven't said is what the conditions of measurement were.

    You need to make measurements under load, as well as open circuit.

    Commercial battery and power supply testers put a specific suitable load on each battery during testing.

    The quality of a supply is determined by its response to increasing load (decreasing load resistance).

    The term % regulation is a useful parameter, being the (open circuit voltage - full load voltage) divided by the open circuit voltage, expressed as a %.
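    The formula above is easy to put into code. This sketch follows the definition as stated, dividing by the open-circuit voltage (note that some textbooks divide by the full-load voltage instead); the 9.36 V / 8.9 V figures are made-up example numbers, not measurements from this thread:

    ```python
    def percent_regulation(v_open_circuit, v_full_load):
        """% regulation = (open-circuit voltage - full-load voltage)
        / open-circuit voltage, expressed as a percentage."""
        return (v_open_circuit - v_full_load) / v_open_circuit * 100.0

    # Hypothetical supply: 9.36 V unloaded, sagging to 8.9 V at full load
    print(round(percent_regulation(9.36, 8.9), 1))  # -> 4.9
    ```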
  5. MrChips


    Oct 2, 2009
    You need to double-check your voltmeter against a verifiable reference.

    One rough test would be to assemble a circuit using a voltage regulator such as an LM7805A.
    The output should be within 2% of the specified voltage.
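    A 2% tolerance check like the one described is just a percentage comparison. This small sketch (the tolerance value and test readings are illustrative, not from any datasheet) shows the arithmetic:

    ```python
    def within_tolerance(measured, nominal, tol_pct=2.0):
        """True if the reading is within tol_pct percent of the nominal value."""
        return abs(measured - nominal) / nominal * 100.0 <= tol_pct

    print(within_tolerance(5.05, 5.0))  # True: only 1% high
    print(within_tolerance(3.28, 3.0))  # False: 9.3% high
    ```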
  6. crutschow


    Mar 14, 2008
    If you are reading a more-or-less consistent bias voltage increase for all the voltages you measure on the power supply, then I suspect you are reading the voltages correctly.

    I assume this is an inexpensive wall-wart type supply, and you can't expect those to have high accuracy. Those voltages are likely typical for such supplies. They probably set the voltages somewhat high to compensate for voltage drops when the supply is loaded. Try measuring the voltages with a small load, say 100 mA.
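    To draw a test current of roughly 100 mA (the figure suggested above), Ohm's law gives the resistor value, and P = V x I gives the power it must handle. A quick sketch of the arithmetic for each voltage setting:

    ```python
    def load_resistor(voltage, current):
        """Ohm's law: R = V / I, plus the power the resistor dissipates (P = V * I)."""
        resistance = voltage / current
        power = voltage * current
        return resistance, power

    for v in (3.0, 6.0, 9.0, 12.0):
        r, p = load_resistor(v, 0.100)  # 100 mA test load
        print(f"{v:4.1f} V -> {r:5.0f} ohm, {p:.2f} W")
    ```

    Note that at 12 V the resistor already dissipates 1.2 W, so a common 1/4 W part would overheat; a power resistor rated with headroom is needed.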
  7. geratheg

    Thread Starter Member

    Jul 11, 2014
    It is a cheap wall supply.

    Thanks for all the help.
  8. geratheg

    Thread Starter Member

    Jul 11, 2014
    Measured USB output within 1% with a $10 multimeter. It showed 5.05 V.

    I guess the power supply has slightly greater voltage to compensate for losses under load.
  9. THE_RB

    AAC Fanatic!

    Feb 11, 2008
    Those voltages are very typical.

    I've opened up a lot of those little switch-selectable DC PSUs; they usually use an LM317 voltage regulator, a handful of resistors, and the switch. It's typical for the voltages to read a fraction high when measured under no-load conditions.
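    For reference, the LM317's output is set by two resistors via the standard formula Vout = 1.25 V x (1 + R2/R1), ignoring the small adjust-pin current. The resistor values below are purely illustrative, not taken from any actual Dynex unit:

    ```python
    def lm317_vout(r1, r2, v_ref=1.25):
        """Standard LM317 formula: Vout = Vref * (1 + R2/R1).
        Ignores the ~50 uA adjust-pin current, which adds a small extra offset."""
        return v_ref * (1 + r2 / r1)

    # Hypothetical divider: R1 = 240 ohm, R2 switched per voltage setting
    print(round(lm317_vout(240, 390), 2))  # -> 3.28
    ```

    With standard resistor values the achievable outputs rarely land exactly on the nominal setting, which is one reason these supplies read slightly high or low by design.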
  10. MrCarlos

    Active Member

    Jan 2, 2010
    Hi there

    From another point of view:

    You have:
    Setting 3V, Reading 3.28V, Error: +9.3%
    Setting 4.5V, Reading 4.78V, Error: +6.2%
    Setting 6V, Reading 6.28V, Error: +4.7%
    Setting 7.5V, Reading 7.78V, Error: +3.7%
    Setting 9V, Reading 9.36V, Error: +4.0%
    Setting 12V, Reading 12.28V, Error: +2.3%

    If your power supply is, as you mention, a cheap wall supply, then it is within a tolerable error range, roughly the typical 5%.

    That assumes your DMM is pretty good.
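    As a sanity check, the error percentages in the table can be reproduced directly from the reported readings:

    ```python
    # Settings and readings as reported earlier in the thread
    readings = {3.0: 3.28, 4.5: 4.78, 6.0: 6.28, 7.5: 7.78, 9.0: 9.36, 12.0: 12.28}

    for setting, measured in readings.items():
        error_pct = (measured - setting) / setting * 100.0
        print(f"Setting {setting:4.1f} V  Reading {measured:5.2f} V  Error: {error_pct:+.1f}%")
    ```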