Measuring duty cycle of proximity sensor using multimeter

Thread Starter


Joined Mar 13, 2022
Hello. We have an application that uses a prox sensor to read the teeth of a drive gear, with the intent of triggering an alarm if the gear stops turning. In normal operation the pulse train is read by a microcontroller, which in turn reports to an HMI running on a Linux box. The engineers have specified the correct positioning of the prox not by physical distance but by duty cycle. It is believed possible to measure the duty cycle of this sensor using a multimeter with a Hz/% mode, which would be desirable for setting these sensors earlier in the assembly process. However, after a day or so of trying, none of us can get a plausible reading on either our Uni-T UT139C or a Fluke 179. I guess I could use an oscilloscope for this, but we don't have one and aren't going to get one. Any advice on how to set this up would be much appreciated.

Kind Regards,

Nathan in Athens


Joined Jun 19, 2012
Ok you just want to be able to detect a specific duty cycle?

You can integrate the signal down to near-DC with an RC network, then measure the resulting voltage with a DC meter.
A 50% duty cycle reads as 50% of the ON voltage.
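As a quick sanity check, the averaging math behind this is simple enough to sketch in a few lines (the 24 V supply and 0 V off-level here are illustrative assumptions, not values from the thread):

```python
# DC level that an RC-integrated rectangular pulse train settles to.
# With heavy filtering, the output is just the time-weighted average
# of the high and low voltages.
def average_voltage(v_on, v_off, duty_cycle):
    """Return the settled DC reading for a given duty cycle (0.0 to 1.0)."""
    return duty_cycle * v_on + (1.0 - duty_cycle) * v_off

# Example: 24 V sensor output pulled fully to 0 V when off, 50% duty
print(average_voltage(24.0, 0.0, 0.5))   # -> 12.0 V on the DC meter
```

Note that if the OFF level isn't 0 V (e.g. the sensor output only sinks partway), the reading shifts accordingly, which is why step 1 in the next reply matters.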


Joined Jul 29, 2018
1) Get the sensor producing both high and low output voltages; use a pull-up/-down resistor if needed.
2) Determine the frequency (in Hz) that results when you spin the gear as fast as the setup allows.
3) Add a series resistor of about 1/20th the input resistance of the meter.
4) Add a capacitor directly across the meter such that it and the series resistor form a lowpass filter with a corner frequency no more than 1/4th the frequency determined in #2. Use the equation C = 1/(2*3.14*R*f) to get the minimum capacitor value. A slightly bigger cap would be OK; it isn't critical.
5) Spin the gear as fast as you can. With the meter reading "DC Volts", a value of about 47% of the sensor supply voltage should be close to a 50% duty cycle.
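The component-sizing steps above can be sketched numerically. The 10 Mohm meter input resistance and 200 Hz tooth frequency below are placeholder assumptions; plug in your own measured numbers:

```python
import math

def filter_values(meter_r_ohms, signal_hz):
    """Size the series resistor and minimum capacitor per the steps above."""
    r_series = meter_r_ohms / 20.0    # step 3: ~1/20th of meter input resistance
    f_corner = signal_hz / 4.0        # step 4: corner at no more than 1/4 signal freq
    c_min = 1.0 / (2.0 * math.pi * r_series * f_corner)  # C = 1/(2*pi*R*f)
    return r_series, c_min

# Example: typical 10 Mohm DMM input, gear teeth passing at 200 Hz
r, c = filter_values(10e6, 200.0)
print(f"R = {r / 1e3:.0f} kohm, C >= {c * 1e9:.1f} nF")  # -> R = 500 kohm, C >= 6.4 nF
```

A bigger cap than the computed minimum just slows the settling time; the reading itself doesn't change, which is why step 4 says the value isn't critical.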