I am trying to learn about measuring motor starting current, using a couple of different Fluke digital multimeters and a current clamp. I am a general contractor who builds houses (Seattle, WA, USA area) and have very little electronics knowledge, so I'm hoping that someone will take pity on me and explain the basics.
The motors in question are single-phase AC motors, usually 120 V but sometimes 240 V, in the 1 HP to 3 HP range. They are the type used in typical electric tools like air compressors, table saws, etc. This all started when I tried to figure out why one air compressor was causing nuisance trips of its circuit breaker while another, similar compressor was not.
The meters I have are a Fluke 179 and an 87-V, and the current clamp is a Fluke i200. I use a line splitter at the wall plug to attach the clamp. Both meters have a "MIN MAX" mode that captures and displays the minimum and maximum readings. The difference, as I understand it, is that the 87-V has a faster response time (a shorter internal sampling interval?) and should be more accurate at capturing a short-lived value.

However, I have found that the 87-V consistently gives smaller motor-start readings, which confuses me. Example: on a 2 HP, 120 V single-phase air compressor, the running current is ~15 A according to both meters, but the 179 records a starting current of 43 A while the 87-V records 35 A. From what little reading I've done, starting (inrush) current is typically 4x the running current or more, which in this case would suggest 60 A or higher.
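To check my intuition about why the response time would matter, I put together a toy Python simulation of a meter averaging a decaying inrush current. To be clear, this is not a model of either real Fluke meter: the inrush peak, the decay time constant, and the two averaging-window lengths are all made-up values chosen only for illustration.

```python
# Toy simulation: how a MIN MAX averaging window smooths a decaying
# motor inrush current. All parameters below are assumed values for
# illustration, not specs for the Fluke 179, 87-V, or i200.
import numpy as np

FS = 10_000    # samples per second for the simulated RMS envelope
T_END = 2.0    # seconds of simulated run time
I_RUN = 15.0   # steady running current, A (from my measurements)
I_PEAK = 60.0  # assumed true inrush peak, ~4x running current
TAU = 0.15     # assumed decay time constant of the inrush, seconds

t = np.arange(0.0, T_END, 1.0 / FS)
# Decaying-exponential inrush envelope: starts at I_PEAK, settles to I_RUN.
i_rms = I_RUN + (I_PEAK - I_RUN) * np.exp(-t / TAU)

def min_max_reading(envelope: np.ndarray, window_s: float) -> float:
    """Max value a meter would record if it averages over window_s."""
    n = max(1, int(window_s * FS))
    kernel = np.ones(n) / n
    # 'valid' keeps only windows that lie fully inside the record.
    averaged = np.convolve(envelope, kernel, mode="valid")
    return averaged.max()

# Hypothetical response times -- the real specs would be in the manuals.
for label, window in [("faster meter, 100 ms window", 0.100),
                      ("slower meter, 250 ms window", 0.250)]:
    print(f"{label}: MAX reads {min_max_reading(i_rms, window):.1f} A")

print(f"true peak of the envelope: {i_rms.max():.1f} A")
```

In this sketch the longer averaging window always reports a lower MAX, and both meters read below the true peak. So if the 87-V really does have the faster response, I would expect it to capture a number closer to the true peak than the 179, which is the opposite of what I am seeing.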
Bottom line, I have little real understanding of how these meters work internally, but I believe I am operating them as intended. I'm fairly sure an accurate reading of this kind should be possible with this equipment, though I'm aware an oscilloscope or power analyzer could also be used (tools like those would be wasted on me).
Any guidance would be greatly appreciated.