Digital/Analog Multimeter, Frequency Limit in AC Readings

Thread Starter

igtaba

Joined Apr 4, 2018
1
Hello people, I'm new to the forum and I have a simple question for a university laboratory report about measuring instruments.


A multimeter, either analogue or digital, has a frequency limit for its AC voltage readings. I want to know why there is such a limit.


I believe that in a digital multimeter it is because the meter has to rectify the AC wave, and in doing so it uses capacitors to reduce the ripple in the rectified voltage, so we obtain something like an almost-DC signal that still has a small periodic ripple. But if the frequency is greater than the limit, the samples taken are wrong, because the meter no longer takes its readings within one period of that almost-DC signal; instead it picks up values from other periods when the frequency is too high.

I know an analogue multimeter has a rectifier like the digital one, but I don't know what the physical effect on the needle is, and thus why the limit is necessary.


I would really appreciate any help with this question, or comments on any of the limitations of a multimeter, either analogue or digital. Pardon me in advance for my bad English; it is not my native language.
 

Hymie

Joined Mar 30, 2018
1,277
igtaba said:
A multimeter, either analogue or digital, has a frequency limit for its AC voltage readings. I want to know why there is such a limit.


For the most part, I suspect the major factor limiting the measurement frequency is the stray capacitance within the measurement circuit, although component frequency limitations may play a part.

Any reasonable digital multimeter will have an input impedance of 10M ohm.

At 10 kHz, a capacitance of just 10 pF has a reactance of about 1.6 MΩ, which is already comparable to that input impedance. With instrument specifications claiming accuracy of better than 2%, stray capacitance will have unwanted effects on the voltages present within the measurement circuit, especially where high-impedance circuits are involved.
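To put rough numbers on this, here is a small Python sketch. It models the meter input as its 10 MΩ resistance in parallel with an assumed stray capacitance, fed from an assumed 1 MΩ source impedance (a deliberately high-impedance circuit); all component values are illustrative, not from any real meter.

```python
import math

R_IN = 10e6       # typical DMM input resistance, 10 Mohm
C_STRAY = 10e-12  # assumed stray capacitance (hypothetical value)
R_SRC = 1e6       # assumed source impedance of a high-impedance circuit

def reading_ratio(f_hz):
    """Voltage actually seen at the meter input, as a fraction of what a
    purely resistive divider (no stray capacitance) would deliver."""
    z_c = 1 / (2j * math.pi * f_hz * C_STRAY)  # capacitor impedance
    z_in = (R_IN * z_c) / (R_IN + z_c)         # input R parallel stray C
    ideal = R_IN / (R_IN + R_SRC)
    actual = abs(z_in / (z_in + R_SRC))
    return actual / ideal

for f in (50, 1e3, 10e3, 100e3):
    print(f"{f:>8.0f} Hz: reading is {reading_ratio(f):.1%} of ideal")
```

At 50 Hz the stray capacitance is essentially invisible, but by 100 kHz the reading has collapsed to a small fraction of the true value, which is exactly the kind of error the 2% accuracy spec cannot tolerate.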

Looking at the specifications for Fluke bench multimeters, I see that some models offer AC voltage measurements up to 100 kHz, but these are high-end units; most cheap handheld multimeters will be limited to 1 kHz (and possibly less).
 

danadak

Joined Mar 10, 2018
4,057
If the meter is a true-RMS AC type, then it is probably the speed limitations of the op-amp circuit in the RMS converter, i.e. the speed of the op-amp.

Regards, Dana.
 

Hymie

Joined Mar 30, 2018
1,277
danadak said:
If the meter is a true-RMS AC type, then it is probably the speed limitations of the op-amp circuit in the RMS converter, i.e. the speed of the op-amp.
Although I mentioned component frequency limitations might play a part, even the humble 741 op-amp has a slew rate of 0.5 V/µs. If we consider a sine wave within the voltage limits of a 9 V battery, theoretically it could achieve a frequency around 100 kHz. But at this frequency, a stray capacitance of just 100 pF would have an impedance of about 16 kΩ.

Imagine building a circuit that measured the required input voltage, then bridging adjacent points of that circuit with a 16 kΩ resistance. It would be a well-designed circuit indeed that behaved the same whether the input was at 50 Hz or 100 kHz, given the varying loading effect of the stray capacitance as the frequency increases.
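The frequency-dependent loading can be made concrete with two lines of arithmetic (Python sketch; the 100 pF stray value is an assumption, not a measured figure):

```python
import math

C_STRAY = 100e-12  # assumed 100 pF of stray capacitance

def reactance(f_hz):
    """Capacitive reactance Xc = 1 / (2*pi*f*C)."""
    return 1.0 / (2 * math.pi * f_hz * C_STRAY)

# The same stray capacitance that is nearly invisible at mains
# frequency becomes a significant shunt at 100 kHz.
print(f"{reactance(50):,.0f} ohm at 50 Hz")      # tens of Mohm
print(f"{reactance(100e3):,.0f} ohm at 100 kHz") # roughly 16 kohm
```

The same physical capacitance goes from a ~32 MΩ non-event at mains frequency to a ~16 kΩ shunt at 100 kHz.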
 

MisterBill2

Joined Jan 23, 2018
18,175
The variation of the reading with frequency is due to the reactance of any capacitors in the system at frequencies other than the intended operating frequency. Also, since anything other than a perfect sine wave contains harmonics, the reading will be affected by the waveform as well.

At much higher frequencies stray capacitance also comes into play.
 

MrChips

Joined Oct 2, 2009
30,711
Meters with an AC voltage range were originally intended to measure mains voltage, 50 or 60 Hz.
Anything above that and all bets are off. Read the spec sheet of the meter to be sure.

Obviously, there has to be a maximum frequency limit. Where that limit lies depends on the meter design: capacitance, active components, op-amps, filters, etc.

Another important consideration is how the meter measures AC: is it true RMS?
 

crutschow

Joined Mar 14, 2008
34,283
even the humble 741 op-amp has a slew rate of 0.5V/us. If we consider a sine wave within the voltage limits of a 9V battery, theoretically it could achieve a frequency around 100kHz.
At 100kHz the maximum slew limited peak signal would be about 0.8V.
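The slew-rate arithmetic is easy to check: the steepest slope of a sine Vp·sin(2πft) is 2πf·Vp, and that slope cannot exceed the op-amp's slew rate. A quick Python check (the 0.5 V/µs figure is the 741's, as quoted above):

```python
import math

SLEW_RATE = 0.5e6  # 741 op-amp slew rate: 0.5 V/us = 0.5e6 V/s

def max_sine_peak(f_hz):
    """Largest sine peak a slew-limited amplifier can track: the maximum
    slope of Vp*sin(2*pi*f*t) is 2*pi*f*Vp, which must not exceed the
    slew rate, so Vp <= SR / (2*pi*f)."""
    return SLEW_RATE / (2 * math.pi * f_hz)

print(max_sine_peak(100e3))   # about 0.8 V at 100 kHz
print(max_sine_peak(17.7e3))  # about 4.5 V, roughly a 9 V battery's swing
```

So a 741 reaches 100 kHz only for signals under about 0.8 V peak; at the full swing of a 9 V supply it runs out of slew rate below 20 kHz.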
If the meter is a true-RMS AC type, then it is probably the speed limitations of the op-amp circuit in the RMS converter, i.e. the speed of the op-amp.
A meter that doesn't measure true RMS still has some sort of precision rectifier (likely op amp) circuit to generate the DC signal for the meter.
The frequency response of that circuit would certainly be a significant factor limiting the meter's frequency response.
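The true-RMS distinction is worth making concrete. An average-responding meter rectifies and averages the signal, then scales by the sine-wave form factor π/(2√2) ≈ 1.11, so it reads correctly only for sine waves. A small idealized Python sketch (not any particular meter's circuit) shows the error on a non-sinusoidal input:

```python
import math

def true_rms(samples):
    """RMS value: square root of the mean of the squares."""
    return math.sqrt(sum(v * v for v in samples) / len(samples))

def average_responding_reading(samples):
    """Rectify, average, then scale by the sine-wave form factor
    pi/(2*sqrt(2)) ~= 1.111 -- correct for sine waves only."""
    avg = sum(abs(v) for v in samples) / len(samples)
    return avg * math.pi / (2 * math.sqrt(2))

n = 10000
sine = [math.sin(2 * math.pi * i / n) for i in range(n)]
square = [1.0 if i < n // 2 else -1.0 for i in range(n)]

print(average_responding_reading(sine), true_rms(sine))      # both ~0.707
print(average_responding_reading(square), true_rms(square))  # ~1.11 vs 1.0
```

For the sine both methods agree, but the average-responding reading of a square wave is about 11% high, which is why waveform (and its harmonic content) matters on a non-true-RMS meter.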
 