Voltage Divider for Sensing Mains Voltage?

KMoffett

Joined Dec 19, 2007
2,918
I think it was a typing error when calculating. I'm typing one-handed with my non-dominant hand lately because of surgery. :(

Ken
 

Thread Starter

jcrollman

Joined May 21, 2008
19
It would be the best option in my opinion.
Would adding the 1:1 transformer eliminate the issue of the DAQ module's input impedance introducing a parallel error into the system? It would be isolated, so there shouldn't be any current drawn into it. Is that a correct statement?
 

BillO

Joined Nov 24, 2008
999
The 1 MΩ input of the DAQ is not going to be much of a load across 20 kΩ. In any case, don't fret over this. Once it is built, just calibrate it against a good, accurate meter or oscilloscope.
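
To put a rough number on that loading, here's a quick sketch (just the 20 kΩ lower arm and the 1 MΩ DAQ input mentioned above; the exact divider values don't change the conclusion much):

Code:
# Rough check of how much a 1 MOhm DAQ input loads the 20 kOhm lower arm of the divider.
# Values are the ones mentioned in this thread; this is an illustration, not a design.

r_lower = 20e3      # lower divider resistor (ohms)
r_daq   = 1e6       # DAQ analog input impedance (ohms)

r_loaded  = r_lower * r_daq / (r_lower + r_daq)   # parallel combination
error_pct = (r_lower - r_loaded) / r_lower * 100

print(f"20 kOhm || 1 MOhm = {r_loaded:.0f} ohms ({error_pct:.1f}% low)")
# -> about 19608 ohms, roughly 2% low; a fixed shift like this calibrates out.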

At this point I need to ask what response time is required on this. I got the impression you want to monitor the voltage and kick off some event if it drops below a certain value. Does this have to be done within milliseconds, a second, a minute, an hour?
 

Thread Starter

jcrollman

Joined May 21, 2008
19
At this point I need to ask what response time is required on this. I got the impression you want to monitor the voltage and kick off some event if it drops below a certain value. Does this have to be done within milliseconds, a second, a minute, an hour?
No event needs to be executed. Essentially this is for a power quality test. From my understanding, if the supply stays within a certain tolerance for a certain length of time, it passes. If not, it fails. Rubber stamp sort of thing for inspection purposes.

The 1 MΩ input of the DAQ is not going to be much of a load across 20 kΩ. In any case, don't fret over this. Once it is built, just calibrate it against a good, accurate meter or oscilloscope.
I understand with regard to the input resistance. My bigger concern is what would happen to the circuit if I put a 1:1 transformer in it (essentially in parallel with the 20 kΩ resistor), because I don't have a good fundamental understanding of its properties. Ideally, I think of it as getting the same voltage out as I put in, with the added bonus of isolation. My gut tells me it isn't that simple, and that parallel impedance errors might come into play here too, if not other things.
 

BillO

Joined Nov 24, 2008
999
This all seems like tremendous overkill then.

In any case, the transformer should do nothing to the impedance. With an ideal 1:1 transformer it would still be equivalent to having the 1 MΩ load of the DAQ across the 20 kΩ of the divider. Modern transformers are pretty efficient. You are thinking of it the right way.
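
To make that concrete: a secondary load reflects to the primary of an ideal transformer as (Np/Ns)² × Zload, so a 1:1 ratio leaves it unchanged. A small sketch (the 1 MΩ value is the DAQ input from above; the 1:2 case is only there for comparison):

Code:
# Reflected impedance through an ideal transformer: Z_primary = (Np/Ns)**2 * Z_secondary.
# With a 1:1 ratio the DAQ's 1 MOhm input appears on the primary side unchanged,
# so the divider sees the same load as it would without the transformer.

def reflected_impedance(z_secondary, turns_ratio):
    """Impedance seen on the primary side of an ideal transformer (turns_ratio = Np/Ns)."""
    return turns_ratio ** 2 * z_secondary

print(reflected_impedance(1e6, 1.0))   # 1:1 isolation transformer -> 1,000,000 ohms (unchanged)
print(reflected_impedance(1e6, 0.5))   # 1:2 step-up, for comparison -> 250,000 ohms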

Again, do not worry about errors. This will all be taken care of with proper calibration.

Example: put 600 V on the input; if the DAQ measures 51 V, then your calibration constant is 600/51.

To check linearity, put 500 V on it. The DAQ should then read 500 × 51/600 = 42.5 V.

Etc...
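
If it helps, here is that calibration arithmetic as a small sketch (the numbers are the 600 V / 51 V example above; the function name is just illustrative):

Code:
# One known reference point gives the scale factor; a second point checks linearity.

v_ref_in   = 600.0   # known input voltage applied during calibration
v_ref_read = 51.0    # what the DAQ reports for that input

k = v_ref_in / v_ref_read          # calibration constant, 600/51 ~ 11.76

def to_mains_volts(daq_reading):
    """Convert a raw DAQ reading back to the actual input voltage."""
    return k * daq_reading

# Linearity check: apply 500 V and the DAQ should read about 500*51/600 = 42.5
expected_reading = 500.0 * v_ref_read / v_ref_in
print(expected_reading)                  # 42.5
print(to_mains_volts(expected_reading))  # ~500.0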
 

Jaguarjoe

Joined Apr 7, 2010
767
Here are 600 V primary / 24 V secondary transformers:

http://www.hammondmfg.com/PHs.htm

They are $41 each from Newark.

Wiring two of them with their primaries in series gives you a 1200 V primary to put across your 1000 V feed. Wiring the secondaries in parallel gives you a 20 V secondary (be careful with the phasing; you'll get 0 V if it's wrong).
Put a 10 kΩ, 10-turn pot across the secondary and feed the wiper to the DAQ. Use that pot to calibrate the system.
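
As a quick sanity check of those numbers (a sketch; the 50:1 overall ratio follows from the 1200 V series primary and the 24 V paralleled secondaries, and the pot setting shown is only an example):

Code:
# Two 600:24 transformers: primaries in series (1200 V total), secondaries in parallel.
# Overall ratio is 1200:24 = 50:1, so a 1000 V feed comes out at about 20 V.
# Values are the ones suggested above; the pot fraction is just an example setting.

ratio = 1200.0 / 24.0            # overall step-down ratio, 50:1

def secondary_volts(primary_volts):
    return primary_volts / ratio

v_sec = secondary_volts(1000.0)  # ~20 V across the 10 kOhm pot

# The 10-turn pot is just an adjustable divider: wiper voltage = fraction * v_sec.
pot_fraction = 0.5               # example wiper position, set during calibration
print(v_sec, pot_fraction * v_sec)   # 20.0 V on the secondary, 10.0 V at the wiper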
 