Fiber Optics Synchronization tunable delay

Thread Starter

cl10Greg

Joined Jan 28, 2010
67
Hello Everyone,

I am working on a homework problem (attached below) based on Optical Networks: A Practical Perspective, 3rd edition. The required outcome is an algorithm that represents the tunable delay.

The way I am picturing it: there are two signals, both with period T, that are out of synchronization by z. I need to find a formula for z so I can align the two signals. Each 2x2 switch is either in the cross state (c = 0, adds delay) or the bar state (c = 1, no delay), and the delays come in steps of T/2^(k-1).
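
To make that concrete, here is how I am assuming the total delay adds up. The per-stage values (stage i contributing T/2^i in the cross state and nothing in the bar state) are my assumption about the layout, not something stated in the problem:

```python
# Sketch of the total delay through a cascade of k 2x2 switches.
# Assumption: stage i (i = 1..k) adds a delay of T / 2**i in the cross
# state (c_i = 0) and no delay in the bar state (c_i = 1).

def total_delay(c, T):
    """c is a list of switch states: c[i] = 0 (cross, delay) or 1 (bar, no delay)."""
    return sum((1 - ci) * T / 2 ** (i + 1) for i, ci in enumerate(c))

# Example: with T = 1 and switches [0, 1, 0] the delay is T/2 + T/8 = 0.625
print(total_delay([0, 1, 0], T=1.0))
```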

So from a logic point of view I am picturing a feedback system that tunes in the delay:
  • Find the starting value of z (signal 1 - signal 2).
  • Send the signal through a crossed 2x2 switch (c1 = 0).
  • Find the new z after the delay (signal 1 - signal 2).
  • If |z| is still greater than the tolerance, set the next switch to the cross state (ci = 0).
  • Calculate the new z.
  • Repeat until z is within the tolerance of 0, so the signals are now aligned.
  • Count the number of repetitions to determine the number of stages needed to make this happen.
This is coarse tuning (I don't need fine tuning), but I could use some guidance on whether my thinking is correct and how to turn it into an algorithm; I have roughed out what I mean in the sketch below. Also, what is the point of the bar state if I am most likely never going to use the pass-through of the switch? I guess it would matter if the delay went past the synchronization point, but then I would have to tune the other signal, and really we are only manipulating the one signal to match the reference signal. Any thoughts or help are appreciated.
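
Here is a rough Python sketch of the feedback loop I have in mind. measure_offset() is just a placeholder for however z actually gets measured, and the stage delays T/2, T/4, ... down to T/2^k are my guess at the delay-line layout, so treat this as a sketch rather than the book's method:

```python
# Rough sketch of the coarse-tuning feedback loop described above.
# measure_offset() is a stand-in for however z (signal 1 - signal 2) is
# actually measured; here it just subtracts the inserted delay from the
# starting offset and wraps the result into one period T.

TOLERANCE = 1e-3  # how close to zero z must be before the signals count as aligned

def measure_offset(inserted_delay, z0, T):
    """Placeholder measurement of the residual misalignment."""
    return (z0 - inserted_delay) % T

def tune(z0, T, k):
    """Pick k switch states so the inserted delay cancels the starting offset z0."""
    c = [1] * k          # start every switch in the bar state (no delay)
    inserted = 0.0
    for i in range(k):   # work from the coarsest stage to the finest
        step = T / 2 ** (i + 1)          # assumed delay of stage i+1
        trial = measure_offset(inserted + step, z0, T)
        # Keep the cross state only if this stage moves z closer to zero;
        # otherwise leave the stage in the bar state.
        if trial < measure_offset(inserted, z0, T):
            c[i] = 0
            inserted += step
        if measure_offset(inserted, z0, T) <= TOLERANCE:
            break                        # aligned within tolerance, stop early
    return c, inserted

states, delay = tune(z0=0.3, T=1.0, k=6)
print(states, delay)  # e.g. [1, 0, 1, 1, 0, 0] and about 0.297 of inserted delay
```

If the loop works this way, the bar state gets used on any stage whose delay would overshoot the remaining offset, which might be part of the answer to my own bar-state question.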
 

Attachments
