I am trying to derive a scaling factor for an analog voltmeter so I can measure the secondary voltage of an electronic halogen transformer.
http://www.ledbenchmark.com/faq/Transformers-Output-and-Compatibility.html
The output voltage of these things is a "high frequency" square wave (30-100 kHz) with a 100 Hz (twice line frequency) envelope (see link). Digital multimeters often have trouble measuring this unless you opt for a high-bandwidth, true-RMS model with a considerable price tag, so why not try an analog meter?
The RMS spec of the DUT (Osram HTM70) is 11.5 V. I tried three different meters, including two wideband audio VOMs (one with a bandwidth of 1 MHz), and all three show a 10 V reading. Analog meters respond to the average of the rectified waveform and apply a scaling factor of pi/(2*sqrt(2)) ≈ 1.11 (the form factor of a sine wave) to convert that average into the RMS value of a sine wave. Since chopping a sine with a ±1 square carrier leaves the rectified waveform |v(t)| unchanged, the rectified average of the waveform above should, at least to my understanding, be the same as that of a plain sine wave, so I expected the same (correct) reading. This is obviously not the case.
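As a sanity check on that reasoning, here is a quick numeric sketch. It is idealized: I assume a perfect ±1 square carrier at 40 kHz (a guess within the 30-100 kHz range) and a perfect 100 Hz |sin| envelope scaled to the 11.5 V nameplate RMS:

```python
import numpy as np

# One 100 Hz envelope period (10 ms), sampled finely enough
# to resolve the assumed 40 kHz carrier.
fs = 10_000_000                      # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)

Vrms_spec = 11.5                     # HTM70 nameplate RMS
Vp = Vrms_spec * np.sqrt(2)          # peak, assuming a sinusoidal envelope

envelope = Vp * np.abs(np.sin(2 * np.pi * 50 * t))   # 100 Hz |sin| envelope
carrier = np.sign(np.sin(2 * np.pi * 40_000 * t))    # ideal 40 kHz square carrier
v = envelope * carrier

form_factor = np.pi / (2 * np.sqrt(2))   # sine RMS / rectified average ≈ 1.11

for name, w in [("sine", Vp * np.sin(2 * np.pi * 50 * t)), ("chopped", v)]:
    avg = np.mean(np.abs(w))             # what the rectifier/movement responds to
    rms = np.sqrt(np.mean(w ** 2))       # true RMS
    print(f"{name:8s} rect.avg = {avg:6.3f} V   true RMS = {rms:6.3f} V   "
          f"sine-calibrated meter reads = {form_factor * avg:6.3f} V")
```

Both waveforms come out with the same rectified average and the same true RMS (~11.5 V), so an ideal average-responding, sine-calibrated meter should read correctly on either, which is exactly why the 10 V readings puzzle me.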
Where's the flaw in my logic?