Reliable ammeter seems to give a very wrong result for doorbell receiver.

Thread Starter

larrye781

Joined Apr 27, 2022
3
I have an ammeter function in my multi-tester, and I know it to be accurate. I recently bought a wireless doorbell set: a plug-in receiver and a battery-operated transmitter. The receiver's label says it consumes 2 watts. However, when I measure it with my ammeter it reads 0.075 amps (at 220 V), which is about 17 watts. The 2 watts is probably accurate, because the receiver does not get hot; by contrast, a 17 watt light bulb would get quite hot after only a few minutes. What is going on? Does anyone have any idea?
 

Ian0

Joined Aug 7, 2020
9,667
If you measure current and multiply it by voltage you get VA.
Power = current x voltage x power factor.
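A minimal sketch of the distinction in Python (illustrative only; the power-factor values below are assumed, not measured):

```python
# Minimal sketch: apparent power (VA) vs real power (W), assuming sinusoidal waveforms.

def apparent_power(v_rms: float, i_rms: float) -> float:
    """Volt-amperes: what you get by multiplying the meter readings."""
    return v_rms * i_rms

def real_power(v_rms: float, i_rms: float, power_factor: float) -> float:
    """Watts actually dissipated: V x I x power factor."""
    return v_rms * i_rms * power_factor

# A purely resistive load has a power factor of 1; a pure capacitor has 0.
print(apparent_power(220, 0.075))      # 16.5 VA
print(real_power(220, 0.075, 1.0))     # 16.5 W if the load were purely resistive
print(real_power(220, 0.075, 0.0))     # 0.0 W if the current were purely capacitive
```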
 

Thread Starter

larrye781

Joined Apr 27, 2022
3
If you measure current and multiply it by voltage you get VA.
Power = current x voltage x power factor.
Is that supposed to answer my question? Usually when I measure and multiply, the result agrees almost exactly with the stated wattage rating. For example, my 11 watt light bulb at 220 volts showed exactly 0.05 amps on my tester. In this case, with the doorbell, I am off by a factor of more than eight. Is that accounted for by the difference between VA and watts? I think not.
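Working the numbers above makes the comparison concrete (a sketch; it assumes the nameplate wattages are the real power):

```python
# Implied power factors from the figures in this thread.
# Assumption: the nameplate wattage is the real power actually dissipated.

bulb_pf = 11.0 / (220.0 * 0.05)       # rated watts / measured VA
doorbell_pf = 2.0 / (220.0 * 0.075)   # rated watts / measured VA

print(f"Light bulb power factor: {bulb_pf:.2f}")      # 1.00 -- resistive, so VA equals W
print(f"Doorbell power factor:   {doorbell_pf:.2f}")  # ~0.12 -- mostly reactive current
```

A power factor of 1 means VA and watts coincide; a low power factor means most of the measured current is doing no work.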
 

Ian0

Joined Aug 7, 2020
9,667
Is that supposed to answer my question? Is that accounted for by the difference between VA and watts?
Yes.

I suspect that most of the current you are measuring is going through a class-X capacitor on the mains input, which is there for EMC filtering. Current through a capacitor has a power factor of zero.
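A rough back-of-the-envelope sketch of the capacitance that would draw this much current (assumes 50 Hz mains and that essentially all of the measured 0.075 A flows through that capacitor):

```python
import math

# Estimate the capacitance that would pass 0.075 A at 220 V.
# Assumptions: 50 Hz mains; the measured current is almost entirely capacitive.

v_rms = 220.0   # volts
i_rms = 0.075   # amps
freq = 50.0     # hertz (assumed)

reactance = v_rms / i_rms                              # Xc = V / I, about 2.9 kOhm
capacitance = 1.0 / (2 * math.pi * freq * reactance)   # C = 1 / (2*pi*f*Xc)

print(f"Reactance:   {reactance:.0f} ohms")
print(f"Capacitance: {capacitance * 1e6:.2f} uF")      # roughly 1.1 uF
```

Roughly 1 µF is on the scale of film capacitors commonly found across a mains input.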
 

Ya’akov

Joined Jan 27, 2019
9,069
What @Ian0 is getting at is that you are probably measuring apparent power rather than real power, because the current and voltage waveforms are out of phase. You have to take the power factor into account to determine the real power.
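A numerical illustration of the phase-shift point (a sketch; it assumes clean 50 Hz sine waves at the values measured above): averaging the instantaneous product v(t) × i(t) over one cycle gives the real power, and it collapses toward zero as the current moves 90 degrees out of phase with the voltage.

```python
import math

# Average of v(t) * i(t) over one cycle = real power.
# Assumptions: 50 Hz sine waves, 220 V RMS, 0.075 A RMS.

V_RMS, I_RMS, FREQ = 220.0, 0.075, 50.0
N = 100_000  # samples over one cycle

def average_power(phase_shift_deg: float) -> float:
    """Average instantaneous power for a given phase shift between current and voltage."""
    shift = math.radians(phase_shift_deg)
    total = 0.0
    for k in range(N):
        t = k / (N * FREQ)  # time within one mains cycle
        v = V_RMS * math.sqrt(2) * math.sin(2 * math.pi * FREQ * t)
        i = I_RMS * math.sqrt(2) * math.sin(2 * math.pi * FREQ * t - shift)
        total += v * i
    return total / N

print(f"In phase (resistive load):   {average_power(0):.2f} W")   # about 16.5 W
print(f"90 degrees out (capacitive): {average_power(90):.2f} W")  # about 0 W
```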
 

Thread Starter

larrye781

Joined Apr 27, 2022
3
What @Ian0 is getting at is that you are probably measuring apparent power rather than real power, because the current and voltage waveforms are out of phase. You have to take the power factor into account to determine the real power.
OK, thanks, I think I understand. I am not an electrical engineer; I was a software engineer. Waveforms out of phase, hmmm, that makes sense, and I think I finally get the idea of power factor vs. wattage. It comes down to the waveforms being out of phase: if they are 90 degrees out of phase, the power factor is 0. A capacitor stores and releases charge, which would account for the current being out of phase with the voltage. Thanks!
 