How did 4-20 mA become the industry standard for transducers/transmitters?

Discussion in 'General Electronics Chat' started by McDuffie, Apr 13, 2007.

  1. McDuffie

    Thread Starter New Member

    Apr 13, 2007
    And why? Does anyone know the answer to this? I realize that this might not even be the right forum for this sort of question.

    I am an electrical technician with a wee bit of a background in solid state electronics so I was attracted to this forum.

    If this forum and this website are more devoted to very small electronics, as opposed to the whole world of electricity, then can someone please recommend a forum where everything from audio amps to PLCs and variable-frequency motor controllers is discussed?


  2. thingmaker3

    Retired Moderator

    May 16, 2005
    Welcome to All About Circuits. We try to take the "All" part pretty seriously.

    The ISA came up with the 4-20 mA standard in the '70s (I think). It was a standard for analog inputs to digital systems. There was some kind of debate about the lower end (the "4" part), but everyone was in agreement about the high end (the "20" part).

    I was too busy with girls and cars at the time. Perhaps a more seasoned forumite remembers...
  3. beenthere

    Retired Moderator

    Apr 20, 2004
    At a guess - the 4-20 mA interface is low-impedance, and so less likely to have noise problems. At about the time it came into being, I knew a tech working on an assembly line. The high-impedance noise dodge was to pass inputs through 1K of resistance, with a 0.1 microfarad bypass to ground. His job was lugging around a box of 6809s and replacing the ones that had fried (shades of ENIAC).
  4. Papabravo


    Feb 24, 2006
    In any kind of current-loop signaling, the reason for having a non-zero lower limit is to tell the difference between "zero" and "open".

    From the SCADA Handbook

    Click on the Chapter 6 button
  5. Tube Tech

    Active Member

    Jan 11, 2007
    It's a carryover from teletypes.

    * for multiplication in computer programs, / for divide and ^ for raising a number to a power are also teletype holdovers. Teletypes were the high-tech solution to I/O in the early days. Before video.

    If we had gone straight to graphics and keyboards, we would have add, subtract, multiply, divide, square root and superscript keys.
  6. antseezee

    Active Member

    Sep 16, 2006
    The reasons above are good. However, most people wonder why they didn't just range the signal from 0 mA to 20 mA. The key with a 4 mA minimum is that it says the sensor is operating at the bottom of its range. If a wire disconnects, the current drops to 0 mA. If the minimum were set to 0 mA, you would not be able to distinguish between a disconnection in the system and a sensor reading at its minimum.
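    The fault-detection logic described above can be sketched in a few lines. This is only an illustration, not any particular PLC's implementation; the 3.5 mA fault threshold and the 0-100 span are hypothetical values chosen for the example.

    ```python
    # Sketch of 4-20 mA "live zero" handling (illustrative values only).
    FAULT_THRESHOLD_MA = 3.5   # below this, treat the loop as open/broken

    def loop_to_process(current_ma, span_low=0.0, span_high=100.0):
        """Map loop current to a process value: 4 mA -> span_low, 20 mA -> span_high.

        Returns (value, ok); ok is False when the loop looks disconnected.
        """
        if current_ma < FAULT_THRESHOLD_MA:
            return None, False          # wire break: current collapses toward 0 mA
        fraction = (current_ma - 4.0) / (20.0 - 4.0)
        return span_low + fraction * (span_high - span_low), True

    print(loop_to_process(4.0))    # (0.0, True)  -- live zero: sensor at minimum
    print(loop_to_process(12.0))   # (50.0, True) -- mid-scale
    print(loop_to_process(0.0))    # (None, False) -- disconnection detected
    ```

    With a 0 mA minimum, the first and third cases would be indistinguishable, which is the whole point of the live zero.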
  7. hgmjr

    Retired Moderator

    Jan 28, 2005
    Here is a thread on the topic of 4-20ma History I found in another forum. There may be some links there that will prove informative to those whose curiosity has been piqued by this thread.

  8. kender

    Senior Member

    Jan 17, 2007
    There are at least two main reasons that 4 mA was chosen over 0 mA:
    - 4 mA can power some circuitry on the sensor side.
    - 4 mA provides the "live zero". In other words, it's possible to tell whether the output of the sensor is zero or the wires going to the sensor are broken.

    A current loop is also more immune to noise (EMI) - that was another driving factor behind the 4-20 mA standard.
  9. Eddy Kurent


    Apr 9, 2007
    Also, I've read that if a 250 ohm resistor is placed in series, the voltage across that resistor varies from 1 volt at 4 mA to 5 volts at 20 mA. Sort of a standard sort of thing, I guess.
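    The arithmetic behind that 250-ohm sense resistor is just Ohm's law, V = I x R; a quick sketch (illustrative only) shows how the loop current lands on the classic 1-5 V analog-input range:

    ```python
    # Sketch: a 250-ohm sense resistor converts 4-20 mA into 1-5 V (V = I * R).
    R_SENSE = 250.0  # ohms

    def loop_voltage(current_ma):
        """Voltage across the sense resistor for a given loop current in mA."""
        return (current_ma / 1000.0) * R_SENSE

    print(loop_voltage(4.0))   # 1.0 V at minimum (live zero)
    print(loop_voltage(20.0))  # 5.0 V at full scale
    ```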
  10. JoeJester

    AAC Fanatic!

    Apr 26, 2005
    Most Teletypes I had on stations used 60 mA loops. Back when the jump from 45 baud to 75 baud was fast :)

    As transmission speeds increased, 60 mA became a hindrance, so it dropped to 20 mA. As time progresses, even the 4 mA will become a burden.

    The sons and daughters of the sons and daughters of the sons and daughters will wonder why so much energy was wasted when 4 mA was the standard. :)
  11. kender

    Senior Member

    Jan 17, 2007
    Because 4mA provides the "live zero". In an industrial application it's useful to know the difference between the device that outputs zero (4mA) and a dead or disconnected device (0mA).