Probably a stupid question about DC adapters

Thread Starter


Joined Apr 25, 2014
I have been curiously measuring the output voltages of laptop and wall-wart adapters. The laptop ones give a rock-steady DC voltage, some 18 V and some 19 V. However, if I switch my multimeter to AC instead of DC, in ALL cases the voltage shown is almost twice the DC voltage. My question is: is this AC voltage ripple, or is it normal? How can an 18 V DC laptop adapter give out 33 V AC? Or am I doing something wrong? I tested with two multimeters.



Joined Aug 21, 2008
It could be that your meter does not block DC when on the AC scale and is misinterpreting the steady DC voltage as the average of a rectified AC waveform; scaled up to show the RMS of a sine wave, that reading comes out a lot higher than the actual DC voltage.

You can try measuring through a large electrolytic capacitor in series (which blocks the DC component) and see whether that improves things.


Joined Oct 2, 2009
Not all AC meters are built alike. The reading shown is influenced by the shape of the waveform. The consistent type of measurement is the RMS (root mean square): the value of a constant DC voltage that has the same heating effect as the AC signal being measured. Your voltmeter may attempt to show RMS but may not always get it right.
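To illustrate the definition, here is a small sketch (values and sampling are my own, not from the thread) computing the RMS of a sine wave numerically; it confirms that RMS comes out to peak divided by the square root of 2:

```python
import math

def rms(samples):
    """Root mean square of a list of samples: sqrt of the mean of the squares."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# One full cycle of a sine wave with peak 1.0, sampled finely.
n = 100_000
sine = [math.sin(2 * math.pi * k / n) for k in range(n)]

print(round(rms(sine), 4))  # ≈ 0.7071, i.e. peak / sqrt(2)

# A constant DC level equal to this RMS dissipates the same average power
# in a resistor (P = V^2 / R), which is exactly the "same heating effect"
# definition above.
```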

How the RMS value is determined by the voltmeter can become a very complicated matter.

An inexpensive DMM (digital multimeter) may simply rectify the voltage, take the average DC reading, and multiply it by a correction factor chosen for sine waves. This is very likely what your two DMMs are doing. It is adequate for standard AC line voltages but fails for other waveforms, including a steady DC input.
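A sketch of that rectify-average-correct scheme shows why a DC input can read roughly double on such a meter. The correction factors below are the standard sine-wave form factors (about 1.11 for a full-wave design, about 2.22 for a half-wave design); whether a particular cheap DMM uses half-wave rectification is an assumption on my part, but it is the variant that lands in the ballpark of the readings described:

```python
import math

def average_responding_reading(samples, half_wave=False):
    """Model of an average-responding meter: rectify, average, then apply
    the correction factor that makes a pure sine wave display its true RMS."""
    if half_wave:
        avg = sum(max(s, 0.0) for s in samples) / len(samples)
        return avg * math.pi / math.sqrt(2)       # ≈ 2.22 for half-wave
    avg = sum(abs(s) for s in samples) / len(samples)
    return avg * math.pi / (2 * math.sqrt(2))     # ≈ 1.11 for full-wave

n = 100_000
# 230 V RMS mains sine (peak ≈ 325 V) and a steady 18 V DC supply.
sine = [325.0 * math.sin(2 * math.pi * k / n) for k in range(n)]
dc = [18.0] * n

print(round(average_responding_reading(sine), 1))               # ≈ 229.8 (correct for mains)
print(round(average_responding_reading(dc), 1))                 # ≈ 20.0 (full-wave meter)
print(round(average_responding_reading(dc, half_wave=True), 1))  # ≈ 40.0 (half-wave meter)
```

The correction is calibrated for sine waves only, so a DC input that is not blocked gets multiplied by the same factor: about 1.11× on a full-wave design, and about 2.22× on a half-wave one, which is close to the "almost twice the DC voltage" the original poster measured.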

The takeaway here is that your DMM is designed to display the RMS of sine-wave line voltages when set to the AC range.
If you are measuring a DC voltage, use the DC range.
If you want to measure the AC ripple from your power supply, an oscilloscope is the better tool.