Total confusion over RF calibration voltage on power meter.

Thread Starter

ronsoy2

Joined Sep 25, 2013
71
Two RF power meters are on hand: an HP 438A with an HP 8481D sensor, and an HP 5347A with an HP 8484A sensor. Using various attenuators I am not able to make any sense of the readings I am getting from the built-in 50 MHz test source in each instrument. The readings are low by approximately 30% (the unit reads 0.7 microwatt when it should read 1 microwatt). Taking readings with both instruments and both test-signal outputs gives the same result: they both read approximately the same power level, and it is about 30% low. So either I have two different instruments with the same problem, or I have a concept problem.
Using a scope on the reference output (with a 50 ohm termination) I read 1 volt peak to peak. The power chart shows the voltage should be about 0.63 volts peak to peak for 1 milliwatt in 50 ohms. If anything, I would think the reading on the sensor should be high! Both instruments have the same 1 volt peak-to-peak output on the reference, and read the same value from the sensor, so either both failed the same way or something else is wrong. The scope is an HP 54622D (very nice scope) and, tested against a voltage standard, is within one small division. Anyone have experience with these types of power meters?
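The expected voltage follows from the basic power relations, which can be checked in a few lines (a quick sketch; it assumes a clean sine wave into the termination):

```python
import math

P = 1e-3   # 1 mW reference power
R = 50.0   # 50 ohm termination

v_rms = math.sqrt(P * R)           # Vrms = sqrt(P * R)
v_pp = 2 * math.sqrt(2) * v_rms    # sine peak-to-peak = 2*sqrt(2) * Vrms

print(f"Vrms = {v_rms:.4f} V")     # ~0.2236 V
print(f"Vpp  = {v_pp:.4f} V")      # ~0.6325 V
```

So 1 mW dissipated in 50 ohms corresponds to roughly 0.63 Vpp across the load, matching the chart value quoted above.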
 

Wuerstchenhund

Joined Aug 31, 2017
189
It's been a while since I last used an HP 438A or 5347A, but I'm pretty sure the correct calibration voltage is 1 Vpp.

Another thing is that these old power meters were unable to correct for the power head's deviation; i.e., your power head comes with a table showing how far the measured value deviates from the true value at various frequencies. Modern power meters can read this table electronically from the power head and apply the correction factors internally, but not the HP 438A (and I don't recall the 5347A being able to do that either).
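Applying that table by hand amounts to dividing the indicated power by the printed calibration factor for the measurement frequency. A minimal sketch (the table values below are made up for illustration; real values come from the individual head's calibration label):

```python
# Hypothetical calibration-factor table (percent vs. frequency in Hz),
# standing in for the table printed on an actual sensor.
cal_factor_pct = {50e6: 100.0, 1e9: 98.5, 10e9: 96.0, 18e9: 93.0}

def corrected_power(indicated_w, freq_hz):
    """Divide the indicated power by the cal factor (as a fraction)."""
    return indicated_w / (cal_factor_pct[freq_hz] / 100.0)

# e.g. 0.93 mW indicated at 10 GHz with a 96% cal factor:
print(corrected_power(0.93e-3, 10e9))  # ~0.969 mW actual
```

Note that a cal-factor error of this kind is only a few percent, nowhere near the 30% discrepancy described in the thread.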

Also, have the power heads been calibrated recently? 30% off sounds like a lot and suggests the heads could be damaged.
 

nsaspook

Joined Aug 27, 2009
8,177
[post content not captured in this copy of the thread]

Thread Starter

ronsoy2

Joined Sep 25, 2013
71
The question is: WHAT IS THE CORRECT VOLTAGE P-P THAT SHOULD BE MEASURED ON THE CALIBRATOR OUTPUT FOR 1 MILLIWATT IN 50 OHMS? This is where the confusion is. From the dB table it looks like approximately 0.22 volts RMS, or 0.63 volts p-p. The calibrator output on the two DIFFERENT instruments is 1 volt p-p, which does not work out to 1 milliwatt in 50 ohms (the label on the instrument says 1 milliwatt into 50 ohms for the calibrator output). The voltage is being measured with a proper 50 ohm termination using a X10 probe. I don't see how two instruments of this quality could fail in the same way, yet they both have the 1 volt output. There seems to be a concept error somehow.
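Working the arithmetic the other way shows the size of the discrepancy (a quick check, again assuming a clean sine into the 50 ohm termination): 1 Vpp would correspond to about 2.5 mW, roughly +4 dB over the labelled 1 mW.

```python
import math

R = 50.0
v_pp = 1.0                           # the value measured on the scope
v_rms = v_pp / (2 * math.sqrt(2))    # ~0.3536 V for a sine
p_w = v_rms**2 / R                   # power dissipated in 50 ohms
p_dbm = 10 * math.log10(p_w / 1e-3)

print(f"{p_w*1e3:.2f} mW = {p_dbm:.2f} dBm")  # 2.50 mW = 3.98 dBm
```

A +4 dB error is not the same size as the 30% (~1.5 dB) low sensor readings, which suggests the scope measurement and the sensor readings may be two separate issues.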
 

Yaakov

Joined Jan 27, 2019
2,480
ronsoy2 said:
"The question is WHAT IS THE CORRECT VOLTAGE P-P THAT SHOULD BE MEASURED ON THE CALIBRATOR OUTPUT FOR 1 MILLIWATT IN 50 OHMS. ..."
The manual says the sensors should be calibrated to read according to a calibration factor, which is usually printed on the sensor. Did you see that section?
 