I have a question regarding the process of dropping a 12V AC signal down to a 5V AC signal, with respect to the phase of the signal.
To help clarify this here is an example:
Say I have a 12 V AC grid voltage, and I want to match an input 12 V signal to that grid voltage in frequency and phase. I would like to step the 12 V grid reference signal down to a 5 V AC signal so that I can feed it to a DSP chip's A/D input. I would then use a phase-locked loop (PLL) and an H-bridge to match the incoming 12 V signal to the reference signal.
I have one major worry at this point: would dropping the reference grid voltage from 12 V to 5 V with a simple transformer affect the phase of the signal going into the DSP chip?
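For context, here is a rough sketch of what I have in mind on the DSP side: estimating the phase offset between the stepped-down reference and the input from their sampled values. All the numbers (60 Hz grid, 10 kHz sample rate, a 30° lag) are made up for illustration, and the I/Q correlation is just one way to measure phase, not necessarily what the final PLL would do.

```python
import math

FS = 10_000   # assumed ADC sample rate, Hz
F = 60.0      # assumed grid frequency, Hz
N = 1000      # 0.1 s of samples = exactly 6 cycles at 60 Hz

# Hypothetical signals standing in for real ADC samples: the 5 V
# reference (post-transformer) and a 12 V input lagging it by 30 degrees.
ref = [5.0 * math.sin(2 * math.pi * F * n / FS) for n in range(N)]
inp = [12.0 * math.sin(2 * math.pi * F * n / FS - math.radians(30))
       for n in range(N)]

def phase_deg(x, fs, f):
    """Estimate a sinusoid's phase (degrees) by I/Q correlation at f."""
    i = sum(v * math.cos(2 * math.pi * f * n / fs) for n, v in enumerate(x))
    q = sum(v * math.sin(2 * math.pi * f * n / fs) for n, v in enumerate(x))
    return math.degrees(math.atan2(i, q))

offset = phase_deg(inp, FS, F) - phase_deg(ref, FS, F)
print(round(offset, 1))  # -30.0: amplitude scaling alone does not shift the estimate
```

This is also why the transformer question matters: any phase shift the transformer introduces would appear directly in this offset measurement, and the PLL would lock onto the shifted reference rather than the true grid phase.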