Thread Starter

uaff

Joined Jan 29, 2020
1
Hello, everyone!


I have a problem with approaching a university project, whose aim is to detect the phase drift (difference, deviation) between the mains supply voltage and either an internal reference or an externally supplied reference signal. The system should be microcontroller based (MSP430).
So I have 4 questions:

1) Any suggestions on how to approach the project and what equipment will be needed?
2) Could an XOR gate be used to combine the two signals and produce an output pulse whose width corresponds to the phase difference? (A rough timer-capture sketch for reading such a pulse is included after this list.)
3) Should the sine waves be converted to logic-level signals using a Schmitt trigger?
4) The phase drifts at the 50 Hz mains frequency may be too small to detect, so a calibration and verification feature is needed. Any thoughts on that?
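To make question 2 concrete, here is a rough, untested sketch of how the XOR pulse width might be read on the MSP430 side. It assumes an MSP430G2553, SMCLK at the calibrated 1 MHz, the XOR output wired to P1.1 (TA0.CCI0A), and CCS/IAR-style interrupt syntax; none of these details come from the project brief, they are only illustrative choices.

```c
/* Sketch only: Timer_A captures both edges of the XOR phase-detector output.
 * The difference between the falling- and rising-edge captures is the pulse
 * width in timer ticks (1 us per tick at 1 MHz), which is proportional to the
 * phase difference of the two 50 Hz inputs. */
#include <msp430.h>
#include <stdint.h>

volatile uint16_t pulse_ticks = 0;       /* latest XOR pulse width in ticks */

int main(void)
{
    WDTCTL = WDTPW | WDTHOLD;            /* stop watchdog */

    DCOCTL  = 0;                         /* run DCO/SMCLK at calibrated 1 MHz */
    BCSCTL1 = CALBC1_1MHZ;
    DCOCTL  = CALDCO_1MHZ;

    P1DIR &= ~BIT1;                      /* P1.1 as input            */
    P1SEL |= BIT1;                       /* route P1.1 to TA0.CCI0A  */

    TA0CCTL0 = CM_3 | CCIS_0 | SCS | CAP | CCIE;   /* capture both edges, sync */
    TA0CTL   = TASSEL_2 | MC_2 | TACLR;            /* SMCLK, continuous mode   */

    __bis_SR_register(GIE);              /* enable interrupts */

    for (;;) {
        /* pulse_ticks / 20000.0 * 360 gives the phase difference in degrees
         * (one 50 Hz period = 20 ms = 20000 ticks at 1 MHz). */
    }
}

#pragma vector = TIMER0_A0_VECTOR
__interrupt void ta0_ccr0_isr(void)
{
    static uint16_t rising_edge = 0;

    if (TA0CCTL0 & CCI) {                /* input is now high: rising edge  */
        rising_edge = TA0CCR0;
    } else {                             /* falling edge: pulse just ended  */
        pulse_ticks = TA0CCR0 - rising_edge;   /* unsigned wrap is fine */
    }
}
```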

I would appreciate any help! Thank you!
 

DickCappels

Joined Aug 21, 2008
10,169
Either an XOR gate or (preferably) an RS flip-flop or a D latch used as an RS flip-flop is a good way to measure phase differences. You can also use a transmission gate or a diode bridge to sample the sine wave and measure the sampled voltage.
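To illustrate the sampling idea in code: if the unknown sine is sampled exactly at the reference's positive-going zero crossing, the sample equals Vp*sin(phi), so the phase offset falls out of an arcsine. This is only a sketch of the arithmetic; the helper name phase_from_sample and the example numbers are made up for illustration.

```c
/* Sketch: converting a sampled voltage to a phase estimate.
 * v(t) = Vp*sin(2*pi*f*t + phi) sampled at the reference zero crossing (t = 0)
 * gives Vp*sin(phi), so phi = asin(sample/Vp).  Valid for |phi| <= 90 degrees. */
#include <math.h>
#include <stdio.h>

double phase_from_sample(double sample_volts, double peak_volts)
{
    return asin(sample_volts / peak_volts) * 180.0 / M_PI;  /* degrees */
}

int main(void)
{
    /* Example: a 1.0 V reading on a 3.0 V-peak sine implies about 19.5 degrees. */
    printf("phase = %.1f deg\n", phase_from_sample(1.0, 3.0));
    return 0;
}
```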

Yes, definitely condition the sine wave before it drives digital logic. A Schmitt trigger is a good idea, but be aware that the hysteresis might cause unacceptable phase shift.
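For a sense of scale, assuming a 2.5 V-peak conditioned sine and a 0.5 V upper threshold (both made-up values), the hysteresis alone moves the detected edge by roughly 11.5 degrees, about 640 us at 50 Hz:

```c
/* Sketch: rough estimate of the phase shift a Schmitt trigger's hysteresis adds.
 * The trigger flips when the sine reaches the upper threshold Vth instead of
 * zero, i.e. at angle asin(Vth/Vp) after the true zero crossing.
 * The component values below are illustrative assumptions only. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double vp  = 2.5;    /* assumed peak of the conditioned sine, volts */
    const double vth = 0.5;    /* assumed upper hysteresis threshold, volts   */
    const double f   = 50.0;   /* mains frequency, Hz                         */

    double phase_deg = asin(vth / vp) * 180.0 / M_PI;   /* ~11.5 degrees */
    double delay_s   = (phase_deg / 360.0) / f;          /* ~640 us       */

    printf("phase error %.1f deg, delay %.0f us\n", phase_deg, delay_s * 1e6);
    return 0;
}
```

If both inputs pass through identical triggers the offset largely cancels; otherwise it is a systematic error that the calibration step in question 4 could remove.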

More information about what you want to do is needed before question 4 (the calibration and verification feature) can be answered.
 