BG7TBL 10MHz OCXO Frequency Standard w/ 8 port distribution amplifier feeding Siglent SDS2104X Plus scope

Thread Starter

SamR

Joined Mar 19, 2019
5,470
Finally, after a few months of bad deals with AliExpress, I received the latest-model distribution amplifier with the crystal oven built into it. These are built around salvaged OCXOs, usually pulled from cell-tower service. I peeked inside and torqued down the SMA coax fitting from the oven; it looks well made, other than that less-than-finger-tight SMA.

I plugged it in last night to let it warm up and stabilize, placed 50Ω BNC terminators on all unused ports, connected one open port to the scope, and let it collect data. Mean frequency is within 1Hz, but it is not at all stable. Take a peek at the min/max frequency in the measurement table on the screenshot: that is far more than just jitter. Next step is to feed it with a GPS-derived frequency input. I wanted to see just how stable it is before investing another couple hundred USD in a stable GPS frequency source, but it looks like I will have to if I don't want the instruments and radios it feeds going nuts trying to sync with it. I also tried another port with a different RG316 BNC coax, with no better results.
(Attached screenshot: 1775237221912.png)
 

tautech

Joined Oct 8, 2019
496
Pro tips:
Use the blue Print button instead of the Save/Recall menu to save screenshots directly to USB.
That way you can keep a relevant menu visible should you need to show any settings used.

Jitter can be measured with cursors: shift H Pos several screen widths to the left, off the display, until a time-offset value is shown in the timebase tab. The displayed waveform will then show accumulated jitter, although a tiny portion of it will be system jitter.
 
I think that the Min/Max stuff is just an intermediate value of the measurement process. Your instrument is "showing its work."

Looking at your display:
  • There are 200 samples per 10MHz cycle. (The sample rate is "2.00GSa/s") This is 100 samples per horizontal division.
  • The trigger takes place in the center of the screen (the downward orange triangle at the top of the screen). Your trigger condition (rising through just about 0) is met a total of four times.
  • The displayed frequency differences are 1 part in 800. The three frequency numbers that are displayed are 800, 799, and 801 times (10/800)MHz, for "Value", "Min", and "Max", respectively.
  • The "Count" is 6600, which I assume is the number of screens of samples used in computing the frequency. (I really do not know what it is.)
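The "1 part in 800" quantization described above is easy to check numerically. A minimal sketch (sample rate and cycle count taken from the screenshot; the constant names are mine):

```python
# Frequency reading from whole-sample counting: 4 cycles per screen at
# 2.00 GSa/s is nominally 800 samples, but phase can make it 799 or 801.
SAMPLE_RATE = 2e9        # 2.00 GSa/s, from the screenshot
CYCLES = 4               # rising-edge trigger condition met 4 times

for n_samples in (799, 800, 801):
    freq_hz = CYCLES * SAMPLE_RATE / n_samples
    print(f"{n_samples} samples -> {freq_hz:,.1f} Hz")

# 799 samples -> 10,012,515.6 Hz  (the Max reading)
# 800 samples -> 10,000,000.0 Hz  (the Value reading)
# 801 samples ->  9,987,515.6 Hz  (the Min reading)
```

One fewer sample over the four counted cycles pushes the reading up by ~12.5kHz, and one extra pushes it down by the same amount, which matches the spread in the measurement table.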

My guess, and this is a guess, is that the frequency is computed from four cycles (the 4 per screen with the triggering) at 200 samples per individual cycle. This is nominally 800 samples, but some screens come out one sample more or fewer due to phase relationships. Over the course of 6600 screens, 10MHz would span (6600*800)=5280000 samples, so a one-sample change is one part in 5280000, or about 1.9Hz at 10MHz. I think that, in your measurement, you got an imbalance of -1, i.e. (5280000-1)=5279999 total samples while recording (6600*4)=26400 cycles. The math gives ((26400 cycles)/(5279999 samples))*(2G samples/second)=10000001.8939 Hz.
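That arithmetic can be checked directly. A small sketch under the same assumptions (6600 screens of 800 samples, a one-sample imbalance overall):

```python
# Mean over many screens: 6600 screens x 800 samples/screen, one sample
# short overall, while counting 6600 x 4 = 26,400 cycles at 2 GSa/s.
SAMPLE_RATE = 2e9
SCREENS = 6600
SAMPLES_PER_SCREEN = 800
CYCLES = SCREENS * 4                             # 26,400 cycles

nominal_samples = SCREENS * SAMPLES_PER_SCREEN   # 5,280,000
duration_s = (nominal_samples - 1) / SAMPLE_RATE # -1 sample imbalance
mean_freq = CYCLES / duration_s

# Resolution: how far the reading moves when the total count changes by 1.
resolution = (CYCLES * SAMPLE_RATE / (nominal_samples - 1)
              - CYCLES * SAMPLE_RATE / nominal_samples)

print(f"mean frequency: {mean_freq:,.4f} Hz")    # ~10,000,001.8939 Hz
print(f"resolution:     {resolution:.2f} Hz")    # ~1.89 Hz
```

So a single-sample imbalance over the whole run is enough to produce the ~1.9Hz offset in the displayed mean, without any real drift in the OCXO.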

Min/Max is governed by the sample rate and horizontal sweep rate. If another sample rate is available besides the 2GSa/s you are using, you will see different Min/Max but the same Mean.

Anyway, one vote for "aliased digitization artifact that is not in your signal" from some guy on the Internet.
 