Hello,
We are using an ACS 800 drive to run a pump.
The other day, I was cross-checking the current indicated on the drive with a clamp meter.
The drive runs a motor rated 55 kW / 1450 rpm / 415 V / 95 A, located about 200 metres from the drive and connected with 35 sq mm flexible copper cable.
Now, the current indicated on the drive at full load is 95 A. Checking with the clamp meter right at the drive output terminals gives 105 A, but near the motor the clamp meter shows 175 A.
Why is there such a large difference?
I can understand voltage drop increasing with cable length, but such a large deviation in current from the source point to the destination is something I cannot explain.
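Just to put a number on the voltage-drop part of my reasoning, here is a rough back-of-envelope check (a sketch only: it assumes DC resistance of plain copper at about 20 °C and ignores reactance, skin effect, and temperature rise):

```python
import math

# Assumed values for the installation described above.
RHO_CU = 1.72e-8      # ohm·m, resistivity of copper at ~20 °C (assumed)
LENGTH_M = 200.0      # one-way cable length, metres
AREA_M2 = 35e-6       # 35 sq mm conductor cross-section
CURRENT_A = 95.0      # motor full-load current

# Resistance of one 200 m conductor.
r_conductor = RHO_CU * LENGTH_M / AREA_M2          # ohms

# Approximate line-to-line drop for a balanced three-phase load.
drop_ll = math.sqrt(3) * CURRENT_A * r_conductor   # volts
percent = 100.0 * drop_ll / 415.0

print(f"R per conductor: {r_conductor * 1000:.1f} mOhm")
print(f"Line-to-line drop: {drop_ll:.1f} V ({percent:.1f} % of 415 V)")
```

So on these assumptions the run drops only around 16 V (roughly 4 %), which is noticeable but nothing that would explain the current readings.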
What also puzzles me: with 35 sq mm cable, the insulation should get very hot and start to degrade at such a high current at the motor end, yet the temperature there is fine, around 30 degrees.
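My heating intuition can be made quantitative with a quick sketch (again assuming the DC resistance of 35 sq mm copper at about 20 °C, roughly 0.49 mOhm/m, and ignoring skin effect):

```python
# Compare I^2 * R dissipation per metre of one conductor at the two readings.
RHO_CU = 1.72e-8            # ohm·m, copper at ~20 °C (assumed)
R_PER_M = RHO_CU / 35e-6    # ohms per metre of one 35 sq mm conductor

for amps in (95.0, 175.0):
    watts_per_m = amps ** 2 * R_PER_M
    print(f"{amps:5.0f} A -> {watts_per_m:5.1f} W per metre per conductor")
```

A true 175 A would dissipate more than three times the heat of 95 A in the same cable, so a cable sitting at 30 degrees does not fit a genuine 175 A flowing in it.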
Am I missing some basic law of electricity?