Simply Measuring AC Power

Thread Starter

BkkChris

Joined Jun 3, 2008
5
Hi,
I have a multimeter with an AC current mode. I know that due to phase shift it may not be accurate to just measure AC current to get power use. But how does this typically work out in small household-device situations?

For example, I'd like to measure the power use of some of my things, like my notebook, fan, router, lights, etc.
I don't have one of those SaveAWatt meters and cannot get one easily here.
So I'd like to use my meter's AC current mode to check some of these devices' usage. Is there a power factor to multiply by that typically gives close-enough power values? Or is it just 220 V * AC current that gives the closest estimate?

I guess I'm asking if there is a table somewhere of power factors for common household items?
Of interest are fluorescent lights, fans, and switching supplies for computer gear.
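To make the question concrete, here is the kind of back-of-the-envelope calculation I have in mind, as a minimal sketch. The power-factor values in it are my own rough guesses for illustration, not a reference table (finding real numbers is exactly what I'm asking about):

```python
# Rough estimate of real power from an AC current reading at 220 V mains.
# The power-factor values below are illustrative guesses, NOT measured
# data or a reference table.

MAINS_V = 220.0  # RMS mains voltage here

GUESSED_PF = {
    "incandescent lamp": 1.0,   # resistive, so PF is essentially 1
    "fan (motor)":       0.7,   # small induction motor, guess
    "fluorescent lamp":  0.5,   # magnetic ballast, guess
    "switching supply":  0.6,   # no power-factor correction, guess
}

def estimated_watts(i_rms_amps, power_factor):
    """Real power ~= V_rms * I_rms * PF; without the PF this is just VA."""
    return MAINS_V * i_rms_amps * power_factor

i_rms = 0.20  # example: the meter reads 0.20 A RMS
for device, pf in GUESSED_PF.items():
    watts = estimated_watts(i_rms, pf)
    va = MAINS_V * i_rms
    print(f"{device}: ~{watts:.0f} W real (vs {va:.0f} VA apparent)")
```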

Thanks for any input here.

beenthere

Joined Apr 20, 2004
15,819
One rarely accounts for power factor when measuring the power usage of a laptop. Due to the variability of power usage by the processor and drives, getting that particular measurement will require monitoring over some period of time.

Be extremely careful if you feel you really must take these current measurements. All of them require breaking the supply line in order to place the ammeter in circuit. That exposes lethal voltages, and it also means repairing the power cord afterwards. Neither is really worthwhile, as the power ratings are displayed on the laptop/appliance.

mik3

Joined Feb 4, 2008
4,843
You don't really need to measure the power consumption of each of your devices, because the power each one consumes is written on it. But if you still want to measure, be careful with the mains voltage, as beenthere told you; it is dangerous, so if you don't know what you are doing it is better not to do it.

silencer

Joined Jun 3, 2008
9
As stated, I think your best bet is to budget your load using the manufacturer's ratings.

If you do decide to measure the current yourself, there are a few obstacles:

First, as mentioned before, a regular multimeter would require you to break the conductors, something that should never be done on mains wiring. For any household current measurement, a current clamp should be used. You may be able to buy one that plugs right into your DMM and reads off the current, adjusted with the clamp's scale factor.

Second, as the hot and neutral conductors are bundled together in one cord, you would not be able to place your clamp around only the device's hot wire (clamping the whole cord reads zero, because the equal and opposite currents cancel). The solution would be to place your clamp around the appropriate branch-circuit wire in your electrical panel, provided that the device you want to measure is the only one drawing power.

Third, to make an accurate current measurement, you need the true-RMS value. Unless you are using a high-quality multimeter that specifically advertises true-RMS readings, you can assume that the meter simply "calculates" the RMS current by detecting the peak value and multiplying by 0.707. That assumes a perfectly sinusoidal current waveform, which could give a pretty inaccurate reading, given that many power supplies these days are switched-mode; the sketch below shows how big the error can be.
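The pulsed waveform in this sketch is only a crude stand-in for the narrow current pulses a capacitor-input supply draws near the voltage peaks; it is an assumption for illustration, not a measured trace:

```python
import math

# Compare true RMS with the "peak x 0.707" shortcut on a non-sinusoidal
# current. The waveform conducts only near the voltage peaks (|sin| > 0.9),
# a crude model of a rectifier feeding a large reservoir capacitor.

N = 1000  # samples over one mains cycle
samples = []
for k in range(N):
    s = math.sin(2 * math.pi * k / N)
    samples.append(s if abs(s) > 0.9 else 0.0)  # conduct near peaks only

true_rms = math.sqrt(sum(x * x for x in samples) / N)
peak_scaled = 0.707 * max(abs(x) for x in samples)  # sine-wave assumption

print(f"true RMS:     {true_rms:.3f}")              # ~0.52
print(f"peak * 0.707: {peak_scaled:.3f}")           # 0.707
print(f"error factor: {peak_scaled / true_rms:.2f}x")  # ~1.4x too high
```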

Hope this helps.

jon

Thread Starter

BkkChris

Joined Jun 3, 2008
5
Thank you for the replies.

I have worked with meters and circuits long enough that I'm not worried about putting my meter in circuit, though I guess it's good to warn people. There is certainly no need to break any conductors: with a little thought it is easy to modify an old power bar to accept test leads, so that taking a measurement only means plugging the device into this special outlet.

I fully realize that appliances have ratings on them. Sure, that's great if you want to know the maximum they could use, but I am looking for actual values in use. For example, my fan has three speeds and I would like to compare the usage at each speed. Likewise, when using my notebook, I'm interested in the variation across different tasks. If I had just wanted to read the label, I wouldn't have posted here. Not that I don't appreciate people's replies, but it would be more helpful to have responses to what I actually asked.

Silencer,
You have come close to the crux of what I was asking: on switching supplies, the average active power used differs from the apparent power measured. What I am looking for is an estimate of how large that difference actually is. No one seems to have specifics on this question, but if someone out there has seen any information, that is what I seek: articles or reference tables describing how such measurements differ from actual power usage. I would think this has been looked at somewhere.

Thanks,
Chris :)

silencer

Joined Jun 3, 2008
9
Ok, that last post cleared up your objectives for me a little bit.

I'm a little confused when you say "the average active power used differs from the apparent power measured". By "active power" are you referring to real power, as compared to apparent power?

"Articles or reference tables describing how such measurements differ from actual power usage... I would think this has been looked at somewhere."
I may have some materials to look at; I'll check. I'm an engineer with an electric company and we make these types of measurements all the time, but with $7000 recording meters, so it's never done by hand.

I don't think it will be as simple as measuring the current waveform's amplitude, throwing in a power factor, and cranking the numbers. For anything other than motors, lights, and other simple machines that draw a relatively sinusoidal current, you would need to make some fairly accurate assumptions about the waveform. That may be feasible for a simple rectifying supply, but probably not for any type of switcher (although you could probably grind through it).

As I mentioned before, I think your best bet is to obtain a good-quality meter that reads true RMS. Instead of just applying a correction factor like your ten-dollar Radio Shack special, it actually integrates the waveform. Then simply multiply the true-RMS current by the RMS voltage; strictly speaking, that gives apparent power, so it only gets you close to the extent the power factor is near one. If you had access to one (they're expensive), you could use a meter that measures the voltage and current simultaneously and does the calculation for you. That would be the most accurate.

Ratch

Joined Mar 20, 2007
1,070
BkkChris,

"I have a multimeter with an AC current mode. I know that due to phase shift it may not be accurate to just measure AC current to get power use. But how does this typically work out in small household-device situations?"
You are correct to worry about phase shift; that is a common mistake many folks make when they try to do power measurements. Multiplying the RMS voltage by the RMS current will not produce an accurate result if a phase difference is present, and with any sizeable motor load you can expect some phase shift. I hate to say it, but what you are trying to do is best done with a dual-trace scope with a current probe and a voltage probe. The current probe can be a clamp-on, so you will not have to break the circuit. You will need to multiply the two instantaneous values at each point on the current and voltage curves, plot each product, and then find the average of that curve. That is best done with a computing scope; the sketch below shows the arithmetic.
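Here is that multiply-and-average calculation in code. The 30 degree phase lag and the 0.5 A load are assumed numbers for illustration only:

```python
import math

# Multiply instantaneous v(t) and i(t), average the products to get real
# power, and compare with RMS(V) * RMS(I), the apparent power.

V_PEAK = 220.0 * math.sqrt(2)  # 220 V RMS mains, assumed
I_PEAK = 0.5 * math.sqrt(2)    # 0.5 A RMS load, assumed
LAG = math.radians(30)         # assumed phase lag of current

N = 10000
p_sum = v_sq = i_sq = 0.0
for k in range(N):
    theta = 2 * math.pi * k / N
    v = V_PEAK * math.sin(theta)
    i = I_PEAK * math.sin(theta - LAG)
    p_sum += v * i          # instantaneous power
    v_sq += v * v
    i_sq += i * i

real_power = p_sum / N                                # average of v(t)*i(t)
apparent = math.sqrt(v_sq / N) * math.sqrt(i_sq / N)  # Vrms * Irms

print(f"real power:     {real_power:.1f} W")         # ~95.3 W
print(f"apparent power: {apparent:.1f} VA")          # ~110 VA
print(f"power factor:   {real_power / apparent:.3f}")  # ~cos(30) = 0.866
```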

You asked about the definitions of power. They are usually drawn as a triangle: the horizontal leg is the real power, the vertical leg is the reactive power, and the hypotenuse is the apparent power. Apparent power is also the product of the RMS voltage and the RMS current. The power factor is the cosine of the angle between the real-power leg and the hypotenuse; a worked example follows.
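In symbols, with round illustrative numbers (assumed, not measured):

```latex
% Power triangle: S = apparent, P = real, Q = reactive, \theta = phase angle
\[
  S = V_{\mathrm{rms}} I_{\mathrm{rms}}, \qquad
  P = S\cos\theta, \qquad
  Q = S\sin\theta, \qquad
  S^2 = P^2 + Q^2
\]
% Example at 220 V, 0.5 A, power factor \cos\theta = 0.8:
\[
  S = 220 \times 0.5 = 110\ \mathrm{VA}, \quad
  P = 110 \times 0.8 = 88\ \mathrm{W}, \quad
  Q = \sqrt{110^2 - 88^2} = 66\ \mathrm{var}
\]
```

Ratch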