AC current sensing

Thread Starter

tstevens01

Joined May 18, 2005
2
What a great website! I have not been doing much work with electronics for many years (wife, kids, job, etc.) but now I have a little project that I would appreciate some help with - if anyone has time!

I want to measure the current drawn by a 240 V AC device where the range could be between 30 mA and 13 A, or even zero (device switched off).

Somehow, perhaps with a resistor, I need to turn the current into a voltage between 0 and say 5 volts so that an A/D converter can sample the voltage.

I have tried playing around with current transformers, but the output voltage is either too small or too large. And to make things harder, the CT picks up so much noise that detecting small current changes is almost impossible!

Where do I go from here? Is the primary current range too big or is there another way of measuring current that doesn't involve a CT?

I would really appreciate any suggestions before I go bonkers!

thanks in advance.

Terry

Thread Starter

tstevens01

Joined May 18, 2005
2
Thanks n9xv, that is a very useful website. However, the current transducers seem to be quite expensive and bulky.

I had in mind that there might be a way of taking the large voltage range produced by a cheap current transformer and putting it through some circuit that would 'compress' the overall range of the output voltage.

Also, in my experiments so far it seems like a 50 Hz CT acts as a perfect aerial for all the mains wiring around it. So even with no current in the primary, the secondary still produces about 1 volt peak-to-peak. This means that a small current change in the primary, say 0 to 30 mA, produces a voltage change at the CT output that is very difficult to detect.

Does this make sense, or am I trying to do something impossible here?

Thanks

beenthere

Joined Apr 20, 2004
15,819
Hi,

Your project may end up being more trouble than it is worth, although I am surprised to hear that your CT performs so poorly. The low-current end would be somewhat difficult in any case.

If you want to sample the current as a proportional voltage, you will have to break the supply line and insert a resistor. I did that for years using a 0.1 ohm (1/10 ohm), 50 watt resistor. It was easy because I was measuring pulsating DC and one end of the resistor could go to circuit ground; the drop across the resistor never exceeded 2 volts.
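As a rough illustration (a minimal Python sketch, not anything from this thread; the 0.1 ohm value is simply the one mentioned above), here is what that kind of shunt would see at the two extremes of your 30 mA to 13 A range:

```python
# Rough numbers for a series shunt in a 30 mA - 13 A circuit,
# using the 0.1 ohm value mentioned above (illustrative only).

R_SHUNT = 0.1  # ohms (assumed value)

for i_rms in (0.030, 13.0):  # amps RMS
    v_drop = i_rms * R_SHUNT          # RMS voltage across the shunt
    p_diss = i_rms ** 2 * R_SHUNT     # power dissipated in the shunt
    print(f"{i_rms:6.3f} A -> {v_drop * 1000:8.1f} mV across shunt, "
          f"{p_diss:6.2f} W dissipated")

# Approximate output:
#  0.030 A ->      3.0 mV across shunt,   0.00 W dissipated
# 13.000 A ->   1300.0 mV across shunt,  16.90 W dissipated
```

That is why a 50 watt resistor is needed at the top end, and why the low end (a few millivolts) is the hard part.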

In your case, the resistor will have a large common-mode AC voltage imposed on it. About the only safe way (for your measuring circuit) is to use a transformer to isolate it from the mains. Something like an old filament transformer might do, with the voltage across the resistor applied between one end of the secondary and its center tap. You'll have to figure out how to calibrate the reading, too.
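On the calibration point: since the exact transformer ratio and resistor tolerance won't be known precisely, one simple approach is a two-point linear calibration against known loads. A minimal Python sketch, with made-up numbers purely for illustration:

```python
# Hypothetical two-point linear calibration: take ADC readings at two known
# load currents (e.g. checked with a multimeter) and fit reading -> amps.

def make_calibration(adc_lo, amps_lo, adc_hi, amps_hi):
    """Return a function mapping a raw ADC reading to current in amps."""
    slope = (amps_hi - amps_lo) / (adc_hi - adc_lo)
    offset = amps_lo - slope * adc_lo
    return lambda adc: slope * adc + offset

# Example (invented figures): 12 counts with a 60 W lamp (0.25 A at 240 V),
# 480 counts with a 2.4 kW load (10 A at 240 V).
to_amps = make_calibration(adc_lo=12, amps_lo=0.25, adc_hi=480, amps_hi=10.0)
print(to_amps(240))   # about 5.0 A for a mid-range reading
```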

Rather than trying to run the A/D directly on the isolated, proportional AC voltage, rectify it and measure the resulting DC. Otherwise you have to synchronize the A/D so it consistently reads the same point of the AC waveform, such as right at the peak. Not a major problem, but a diode and a small capacitor are cheaper.
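A minimal sketch of that rectify-and-read idea in Python; the shunt value, transformer ratio, diode drop and ADC parameters are all assumptions for illustration, not values from this thread:

```python
import math

# Read a peak-rectified, filtered DC level with an ADC and convert it back
# to RMS primary current.  All component values below are assumed.

ADC_BITS = 10      # assumed ADC resolution
V_REF    = 5.0     # assumed ADC reference voltage
R_SHUNT  = 0.05    # ohms, assumed series shunt
STEP_UP  = 5.0     # assumed isolation transformer step-up ratio
V_DIODE  = 0.3     # assumed drop of a Schottky rectifier diode

def adc_to_rms_amps(adc_code: int) -> float:
    """Convert a raw ADC reading of the peak-held DC into RMS primary current."""
    v_dc   = adc_code * V_REF / (2 ** ADC_BITS)   # voltage at the ADC pin
    v_peak = v_dc + V_DIODE                       # add back the diode drop
    v_rms  = v_peak / math.sqrt(2)                # peak -> RMS (sine assumed)
    return v_rms / (STEP_UP * R_SHUNT)            # back to primary amps

print(f"{adc_to_rms_amps(512):.2f} A")   # mid-scale code, roughly 7.9 A with these values
```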

Trying to resolve milliamps out of a 13 amp circuit is always going to be interesting.

Good luck.

thingmaker3

Joined May 16, 2005
5,083
Originally posted by tstevens01@May 18 2005, 05:07 PM
I want to measure the current drawn by a 240 V AC device where the range could be between 30 mA and 13 A, or even zero (device switched off).

Somehow, perhaps with a resistor, I need to turn the current into a voltage between 0 and say 5 volts so that an A/D converter can sample the voltage.

Terry
eBay has some 0.06 ohm, 35 watt resistors. The voltage across one would be 1.8 mV at 30 mA and 780 mV at 13 A. An instrumentation amp with a gain of about 6.4 could boost that to 5 V for 13 A, which puts 30 mA at roughly 11.5 mV. The ADC would need at least nine bits just to resolve 30 mA steps across that range (13 A / 30 mA is about 433 levels), and a few more for decent resolution at the low end.
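A quick back-of-envelope check of those figures in Python (the fixed amplifier gain and the 5 V ADC full scale are illustrative assumptions, not specified in the post):

```python
import math

# Sanity-check the numbers for a 0.06 ohm shunt feeding a fixed-gain
# instrumentation amp and an ADC with an assumed 5 V full scale.

R_SHUNT = 0.06          # ohms
V_FULL  = 5.0           # volts at the ADC for 13 A (assumed full scale)
I_MIN, I_MAX = 0.030, 13.0

gain  = V_FULL / (I_MAX * R_SHUNT)           # about 6.4 V/V
v_min = I_MIN * R_SHUNT * gain               # about 11.5 mV at 30 mA
bits  = math.ceil(math.log2(I_MAX / I_MIN))  # bits just to tell 30 mA from zero

print(f"gain = {gain:.2f}, 30 mA gives {v_min * 1000:.1f} mV, need >= {bits} ADC bits")
```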