12 VAC to 5 VAC transformer question

Thread Starter

Docdaa008

Joined Jul 12, 2010
4
I have a question about dropping a 12 V AC signal down to a 5 V AC signal, specifically with respect to the phase of the signal.

To help clarify this here is an example:

Say I have a 12 V AC grid voltage, and I want to match an input 12 V signal to that grid voltage in frequency and phase. I would like to drop the 12 V grid reference signal down to a 5 V AC signal so that I can feed it to a DSP chip's A/D input. I would then use a phase-locked loop and an H-bridge to match the incoming 12 V signal to the reference signal.
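For what it's worth, once both waveforms are sampled by the A/D, the phase error the PLL has to drive to zero can be estimated by correlating each signal against a complex tone at the line frequency (a single-bin DFT). A minimal Python sketch of that idea (function names are my own, and it assumes the sample window holds a whole number of cycles):

```python
import cmath
import math

def phase_difference(sig_a, sig_b, freq, fs):
    """Phase of sig_a relative to sig_b, in radians.

    Correlates each sampled waveform against exp(-j*w*n), i.e. a
    single-bin DFT at `freq`.  The difference of the two resulting
    angles is the phase error, independent of amplitude.  It is most
    accurate when the window holds an integer number of cycles, so
    spectral leakage cancels.
    """
    w = 2 * math.pi * freq / fs
    ref = [cmath.exp(-1j * w * n) for n in range(len(sig_a))]
    pa = cmath.phase(sum(a * r for a, r in zip(sig_a, ref)))
    pb = cmath.phase(sum(b * r for b, r in zip(sig_b, ref)))
    # wrap the difference into (-pi, pi]
    return (pa - pb + math.pi) % (2 * math.pi) - math.pi
```

For example, at a 6 kHz sample rate and a 60 Hz line, 600 samples is exactly six cycles, so the leakage term sums to zero.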


I have one major worry at this point: would dropping the reference grid voltage from 12 V to 5 V using a simple transformer affect the phase of the input signal going into the DSP chip?
 

retched

Joined Dec 5, 2009
5,207
No, the transformer shouldn't affect the phase. If you use a center tap (or any other intermediate tap) you may see some variance, but if you use the start and end taps of the winding, then what you see on the primary is what you see on the secondary (at lower amplitude).
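To put the same point another way: an ideal transformer just multiplies the primary waveform by the turns ratio, and multiplying by a positive constant can't move the zero crossings. A toy Python check of that (an idealized model that ignores magnetizing inductance and leakage):

```python
import math

def transformer_secondary(primary, turns_ratio=5.0 / 12.0):
    """Ideal transformer: v_s(t) = (Ns/Np) * v_p(t).

    A positive scale factor leaves every zero crossing where it was,
    so the secondary stays in phase with the primary.
    """
    return [turns_ratio * v for v in primary]

def rising_zero_crossings(samples):
    """Indices where the waveform crosses zero going positive."""
    return [i for i in range(1, len(samples))
            if samples[i - 1] < 0 <= samples[i]]
```

Scaling a sampled 12 V peak sine down to a 5 V peak one this way leaves the zero-crossing indices, and hence the phase, untouched.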
 

Thread Starter

Docdaa008

Joined Jul 12, 2010
4
Thanks for the help. I was also thinking about using a simple voltage divider instead, since a 12 V to 5 V drop shouldn't be too wasteful.

Any reason to use one over the other?
 

retched

Joined Dec 5, 2009
5,207
Nope, you can use a simple voltage divider to get down to about 6 V and then use a linear regulator. You may need an LDO, but that depends.

Do you need a regulated 5 V?

If not, the divider and a diode will get you pretty close.

What current are you working with?
 

Thread Starter

Docdaa008

Joined Jul 12, 2010
4
I won't need a regulated voltage, but accuracy and cost definitely have me leaning towards the voltage divider.
 