Hello
I am new to using spectrum analyzers and have recently been making a measurement to verify a 50 ohm input impedance (which is what it is specified to be). From my measurement I conclude that it is higher, so I wonder if anyone can spot a mistake in what I am doing.
The test setup is shown in the attached file.
I am expecting to get an amplitude of 315 mV @ 30 kHz (50% of the signal generator's 630 mV amplitude, from the voltage divider formed by the 50 ohm source impedance and the SA input) at the SA input. This is, however, not what I am getting. I measure a 420 mV amplitude on the oscilloscope (which implies a 100 ohm input impedance on the SA), yet the SA itself displays 315 mV. So it seems that the SA is calibrated to compensate for the deviation in input impedance. Could this be the case? Or am I doing something wrong here?
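Just to check the divider arithmetic behind those numbers, here is a minimal sketch. It assumes a 50 ohm generator source impedance and a 630 mV open-circuit amplitude, as implied above; the function names are made up for illustration:

```python
# Sanity check of the voltage-divider arithmetic.
# Assumptions (from the post): 50 ohm source impedance,
# 630 mV open-circuit generator amplitude.
R_SRC = 50.0    # generator source impedance, ohms
V_OC = 630e-3   # open-circuit generator amplitude, volts

def divider_voltage(z_load: float) -> float:
    """Voltage across the load in a simple series voltage divider."""
    return V_OC * z_load / (R_SRC + z_load)

def inferred_load(v_meas: float) -> float:
    """Solve the divider equation for the load impedance."""
    return R_SRC * v_meas / (V_OC - v_meas)

print(divider_voltage(50.0))   # approx. 0.315 V for a true 50 ohm input
print(inferred_load(420e-3))   # approx. 100 ohm implied by the 420 mV reading
```

So a 420 mV scope reading is exactly consistent with a 100 ohm load, which matches the impedance you inferred.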
/Magnus