How did 4-20mA become the industry standard for transducers/transmitters?

Thread Starter

McDuffie

Joined Apr 13, 2007
1
And why? Does anyone know the answer to this? I realize that this might not even be the right forum for this sort of question.

I am an electrical technician with a wee bit of a background in solid state electronics so I was attracted to this forum.

If this forum and this website are more devoted to very small electronics, as opposed to the whole world of electricity, then can someone please recommend a forum where everything from audio amps to PLCs and variable-frequency motor controllers is discussed.

Thanks

McD
 

thingmaker3

Joined May 16, 2005
5,083
Welcome to All About Circuits. We try to take the "All" part pretty seriously.

The ISA came up with the 4-20mA standard in the '70s (I think). It was a standard for analog input into digital systems. There was some kind of debate about the lower end (the "4" part), but everyone was in agreement about the high end (the "20" part).

I was too busy with girls and cars at the time. Perhaps a more seasoned forumite remembers...
 

beenthere

Joined Apr 20, 2004
15,819
At a guess - the 4-20 milliamp interface is low-impedance, and so less likely to have noise problems. At about the time it came into being, I knew a tech working on an assembly line. The high-impedance noise dodge was to pass inputs through 1K of resistance, with a 0.1 microfarad bypass to ground. His job was lugging around a box of 6809s and replacing the ones that had fried (shades of ENIAC).
 

Tube Tech

Joined Jan 11, 2007
46
It's a carryover from teletypes.

* for multiplication in computer programs, / for divide and ^ for raising a number to a power are also teletype holdovers. Teletypes were the high-tech solution to I/O in the early days. Before video.

If we had gone straight to graphics and keyboards, we would have add, subtract, multiply, divide, square root and superscript keys.
 

antseezee

Joined Sep 16, 2006
45
The reasons above are good. However, most people wonder why the range wasn't simply 0 mA to 20 mA. The key with a 4 mA minimum is that it tells you the sensor is operating at the bottom of its range. If a wire disconnects, the current drops to 0 mA. If the minimum were set to 0 mA, you could not distinguish a disconnection in the system from a sensor reading at its minimum.
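The live-zero idea above is easy to sketch in code. This is only an illustration — the 3.6 mA fault threshold and the function name are my own choices, not from any particular transmitter or standard:

```python
# Sketch of live-zero fault detection on a 4-20 mA loop.
# Threshold and scaling are illustrative, not from a specific standard.

def interpret_loop_current(milliamps):
    """Map a measured loop current to a (status, percent-of-span) pair."""
    if milliamps < 3.6:
        # Well below the live zero: broken wire or dead transmitter.
        return ("FAULT: open loop or failed transmitter", None)
    # 4 mA -> 0%, 20 mA -> 100% over a 16 mA span.
    percent = (milliamps - 4.0) / 16.0 * 100.0
    return ("OK", max(0.0, min(100.0, percent)))

print(interpret_loop_current(0.0))   # open circuit reads as a fault
print(interpret_loop_current(4.0))   # sensor at minimum reads as 0%
print(interpret_loop_current(12.0))  # mid-scale reads as 50%
```

With a 0-20 mA scheme, the first two cases would both read 0 mA and be indistinguishable.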
 

hgmjr

Joined Jan 28, 2005
9,027
Here is a thread on the topic of 4-20mA history I found in another forum. There may be some links there that will prove informative to those whose curiosity has been piqued by this thread.

hgmjr
 

kender

Joined Jan 17, 2007
264
The reasons above are good. However, most people wonder why the range wasn't simply 0 mA to 20 mA.
There are at least two main reasons that 4mA was chosen over 0mA.
- 4mA can power some circuitry on the sensor side.
- 4mA provides the "live zero". In other words, it's possible to tell whether the output of the sensor is zero or the wires going to the sensor are broken.

A current loop is also more immune to noise (EMI) - that was another driving factor behind the 4-20mA standard.
 

Eddy Kurent

Joined Apr 9, 2007
18
Also, I've read that if a 250 ohm resistor is placed in series, the voltage across that resistor varies from 1 volt at 4 mA to 5 volts at 20 mA. A sort of standard thing, I guess.
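The arithmetic checks out: by Ohm's law, 4 mA x 250 Ω = 1 V and 20 mA x 250 Ω = 5 V, which drops the loop signal neatly onto a 0-5 V ADC input. A quick sketch of the conversion (the names here are my own, purely for illustration):

```python
# Convert the voltage across a 250-ohm sense resistor back to loop
# current, then to percent of span. 1 V -> 4 mA -> 0%; 5 V -> 20 mA -> 100%.

R_SENSE = 250.0  # ohms

def volts_to_percent(volts):
    milliamps = volts / R_SENSE * 1000.0     # Ohm's law, expressed in mA
    return (milliamps - 4.0) / 16.0 * 100.0  # 4-20 mA span -> 0-100%

print(volts_to_percent(1.0))  # 0.0  (live zero)
print(volts_to_percent(3.0))  # 50.0 (mid-scale)
print(volts_to_percent(5.0))  # 100.0 (full scale)
```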
 

JoeJester

Joined Apr 26, 2005
4,390
It's a carryover from teletypes.
Most Teletypes I had on stations used 60 mA loops. Back when the jump from 45 baud to 75 baud was fast :)

As transmission speeds increased, 60 mA became a hindrance, so it dropped to 20 mA. As time progresses, even 4 mA will become a burden.

The sons and daughters of the sons and daughters of the sons and daughters will wonder why so much energy was wasted when 4 mA was the standard. :)
 

kender

Joined Jan 17, 2007
264
The sons and daughters of the sons and daughters of the sons and daughters will wonder why so much energy was wasted when 4 mA was the standard. :)
Because 4mA provides the "live zero". In an industrial application it's useful to know the difference between the device that outputs zero (4mA) and a dead or disconnected device (0mA).
 