Why is high current used to measure increasingly low resistances?

Status
Not open for further replies.

Thread Starter

smooth_jamie

Joined Jan 4, 2017
107
Hi All,

For my work it is becoming more common for me to measure lower and lower resistances. Megger has a really good manual on how to measure down to micro-ohms (for example, to determine the quality of bus bar joints). I have found the 4-wire measurement to be very repeatable and useful, and I understand that to measure very low resistances the appropriate test current should be selected for the range you want to measure. Megger indicates the following currents and their corresponding ranges:

  1. Ohm range, 1 Ampere
  2. Milli-ohm range, 10 Ampere
  3. Micro-ohm range, 100 Ampere

I am often measuring connections that are around 1 mΩ to 10 mΩ, so I always use 10 A to measure. My question, though, is: why do we need increasingly high current to measure increasingly low resistances? I am guessing higher currents give better resolution for smaller resistances, but this is not explained very well on the internet. Can anyone tell me exactly why resolution has anything to do with current?
 

andrewmm

Joined Feb 25, 2011
326
It's very unlikely you're actually using 100 amps to measure the resistance;
it's "just" the switch settings on the meter.

Think about the size of cable you would need to connect the resistor with to carry 100 A.
 

Papabravo

Joined Feb 24, 2006
13,741
I'll give this a shot. It is not so much the resolution as the dynamic range of the measurement tool. There are at least two conventional ways to measure current:
  1. Break the circuit, insert the ammeter in series, reconnect the circuit.
  2. Run a test current through the load, measure the voltage drop.
In many applications, alternative #1 has certain manifest shortcomings, so we need to understand the implications of alternative #2.
Measuring small voltages can be tricky, since they can drop into the level of the surrounding thermal noise. Let's run some numbers with test currents on 1 milliohm:

1 mA through 1 mΩ produces a voltage drop of 1 μV --> I don't have a voltmeter that can measure microvolts -- do you?
10A through 1 mΩ produces a voltage drop of 10 mV --> I can barely measure that with my Fluke 77 multimeter -- do you see the problem?
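Papabravo's numbers above can be sketched out directly. This is a minimal illustration (mine, not from the thread), assuming a ~1 mV floor as roughly the smallest drop a handheld DMM like a Fluke 77 resolves with any confidence:

```python
# Voltage drop V = I * R for different test currents through a 1 mOhm joint,
# compared against an assumed handheld-DMM floor of 1 mV.

R = 1e-3          # 1 milliohm joint under test
DMM_FLOOR = 1e-3  # assumed smallest drop a handheld DMM resolves well (1 mV)

for current in (1e-3, 1.0, 10.0):     # 1 mA, 1 A, 10 A test currents
    drop = current * R                # Ohm's law: V = I * R
    verdict = "measurable" if drop >= DMM_FLOOR else "lost in the noise"
    print(f"{current:>7.3f} A -> {drop * 1e6:>10.1f} uV  ({verdict})")
```

At 1 mA the drop is 1 μV and hopeless; only at 10 A does it reach 10 mV, a level an ordinary meter can see.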
 

OBW0549

Joined Mar 2, 2015
3,408
1 mA through 1 mΩ produces a voltage drop of 1 μV --> I don't have a voltmeter that can measure microvolts -- do you?
That's an important point. And even if he had a voltmeter that could measure microvolts, he'd have to exercise extreme care to keep his measurements from being hopelessly corrupted by thermoelectric potentials. Every joint between dissimilar metals is a thermocouple and can easily generate dozens of microvolts, or more, for every degree Celsius of temperature difference.
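To put rough numbers on OBW0549's point, here is a sketch (my figures, assuming a 20 μV stray thermoelectric EMF in the leads, chosen purely for illustration) of how badly that EMF corrupts the reading at each of Megger's current/range pairings:

```python
# Relative error a fixed thermoelectric EMF introduces into a 4-wire reading,
# for Megger's suggested current/range pairings. The 20 uV EMF figure is an
# assumption for illustration, not a measured value.

THERMAL_EMF = 20e-6   # assumed stray thermocouple voltage in the leads (20 uV)

pairings = [          # (test current in A, resistance under test in ohm)
    (1.0,   1.0),     # ohm range
    (10.0,  1e-3),    # milliohm range
    (100.0, 1e-6),    # microohm range
]

for current, resistance in pairings:
    signal = current * resistance         # true drop across the joint
    error = THERMAL_EMF / signal * 100.0  # stray EMF as a % of the signal
    print(f"{current:>5.0f} A through {resistance:g} ohm: "
          f"signal {signal * 1e6:>9.1f} uV, thermal error {error:.3f} %")
```

Even at 100 A, a micro-ohm reading's 100 μV signal carries a large thermoelectric error, which is one reason low-resistance ohmmeters typically reverse the test current and average the two readings to cancel it.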
 

Thread Starter

smooth_jamie

Joined Jan 4, 2017
107
Ohm’s Law.
I = V / R
V = I x R
R = V / I

You are detecting voltage, not current
@MrChips you've missed the point. I wouldn't have a job as an electronics engineer if I didn't know how a shunt resistor works to measure current. Please read my question again carefully and post a more appropriate response.

It's very unlikely you're actually using 100 amps to measure the resistance;
it's "just" the switch settings on the meter.

Think about the size of cable you would need to connect the resistor with to carry 100 A.
@andrewmm Please read my question again. I am not using 100 A. I am forcing a known current through a 'resistor' and measuring the voltage drop across it. For very low resistances you should be using high currents. My question is why.

I'll give this a shot. It is not so much the resolution as the dynamic range of the measurement tool. There are at least two conventional ways to measure current:
  1. Break the circuit, insert the ammeter in series, reconnect the circuit.
  2. Run a test current through the load, measure the voltage drop.
In many applications, alternative #1 has certain manifest shortcomings, so we need to understand the implications of alternative #2.
Measuring small voltages can be tricky, since they can drop into the level of the surrounding thermal noise. Let's run some numbers with test currents on 1 milliohm:

1 mA through 1 mΩ produces a voltage drop of 1 μV --> I don't have a voltmeter that can measure microvolts -- do you?
10A through 1 mΩ produces a voltage drop of 10 mV --> I can barely measure that with my Fluke 77 multimeter -- do you see the problem?
@Papabravo We are going off piste with this subject. I work in a lab and have access to a decent bench DMM, so let's assume the quality of the instrumentation is not a problem. I can measure 3 mΩ no problem at 10 A. What I want to know is: if we attempt to measure something lower (maybe micro-ohms), Megger suggests we should use a test current of 100 A. Why do we need increasing currents to do the same job when Ohm's law says any current will do?
 

Thread Starter

smooth_jamie

Joined Jan 4, 2017
107
That's an important point. And even if he had a voltmeter that could measure microvolts, he'd have to exercise extreme care to keep his measurements from being hopelessly corrupted by thermoelectric potentials. Every joint between dissimilar metals is a thermocouple and can easily generate dozens of microvolts, or more, for every degree Celsius of temperature difference.
Oh hang on, the penny has dropped when I read this, @Papabravo. We need a voltmeter in the appropriate range, don't we? OK, so the current ranges recommended by Megger simply come down to how well the instrument can measure the voltage drop across the resistor, right? I'm going to mark this thread as solved.
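The conclusion the thread arrives at can be checked numerically: with a fixed voltmeter floor, the resistance range dictates the minimum usable test current. A sketch (my figures, assuming a 100 μV floor as roughly what a good bench DMM resolves confidently):

```python
# Smallest test current that keeps the voltage drop above an assumed
# voltmeter floor, for each of Megger's resistance ranges: I = V / R.

V_FLOOR = 100e-6   # assumed usable voltmeter resolution (100 uV)

for resistance in (1.0, 1e-3, 1e-6):   # ohm, milliohm, microohm ranges
    i_min = V_FLOOR / resistance       # rearranged Ohm's law
    print(f"{resistance:g} ohm needs at least {i_min:g} A of test current")
```

Under this assumption, Megger's 1 A and 10 A figures for the upper ranges carry generous headroom; it is only at the micro-ohm range that something like 100 A becomes unavoidable.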
 

Thread Starter

smooth_jamie

Joined Jan 4, 2017
107
Maybe now you understand why Ohm's Law applies.
Could you be any more patronising? Go on, insult me some more, why don't you. Perhaps I should go back to primary school and re-educate myself? I'm clearly a simple person who can't grasp the basics, so only waste your time replying if you really want to have the last word on this post.
 