To use a resistor on an IC's input, or not to?

Thread Starter

jlawley1969

Joined Feb 22, 2021
97
This is a simple and probably stupid question: when should I use a resistor to limit current into an IC? Most of the datasheets I have been reading don't give any suggestions or info. That is, except for one IC I was going to use that had an internal zener diode; that one said to have a "sufficiently high input impedance" lol

Anyways, back to the main thing: for example, I am using a PIC10F222 and the datasheet says it draws 1.1mA max, but in the absolute maximum section it says "input clamp current: ±20mA". It is that last part I don't get, because if it only draws 1.1mA, under what condition would it draw 20mA or more?

So when testing/prototyping I have just been putting a 150Ω resistor on all the inputs of my ICs (the other two are IRS2186s), and it has worked fine, but when I make the transition to a PCB, any space I could save would be useful.

thanks
 

Papabravo

Joined Feb 24, 2006
21,159
The simplistic answer for digital circuits is almost never. Like many things in life and electronics, it is not really that simple. You can use a current-limiting resistor if you believe that an input, which is usually high impedance, may be exposed to unexpected voltage levels. Normally a high voltage on a digital input has the potential to do damage, but the resistor can limit the current to a level where any damage will be mitigated. Is that an absolute guarantee? Absolutely not. Is it worth doing? Probably not. Added components come with a cost in a product.

In a career lasting over half a century, there was one case where series resistors were used in a digital circuit. The address lines on an array of DRAM chips had 33Ω series resistors to damp the ringing produced by fast edges between RAS and CAS. Resistors are sometimes used on analog interface circuits to limit current, and diode clamps are used to limit voltage, but they come with a cost in added capacitance.
 

Bordodynov

Joined May 20, 2015
3,177
Microcircuits have protection diodes on their inputs, so 20mA is the maximum allowable current through these diodes. These diodes are part of parasitic thyristor structures; if the parasitic thyristor turns on, the chip will fail (go up in smoke). The thyristor's trigger current is more than 20mA, and at higher temperatures this trigger current decreases. There are special protective input structures which limit the input current even under high input overvoltage (e.g. 100V), while performance is not much degraded.
 

Lo_volt

Joined Apr 3, 2014
316
You are confusing your specs.

IDD is the current at the VDD pin. In other words, it's the current that the device will draw from the power supply through the VDD pin. The datasheet specifies certain conditions for that measurement in a footnote:

"3: The supply current is mainly a function of the operating voltage and frequency. Other factors such as bus loading, bus rate, internal code execution pattern and temperature also have an impact on the current consumption.
a) The test conditions for all IDD measurements in active operation mode are: All I/O pins tri-stated, pulled to VSS, T0CKI = VDD, MCLR = VDD; WDT enabled/disabled as specified.
b) For standby current measurements, the conditions are the same, except that the device is in Sleep mode."

Basically they're saying that all I/O pins are not sourcing or sinking current when this measurement is taken. As you add connections to the device, IDD will increase.

Input clamp current is the maximum current that the device can handle when clamping the voltage at an input. This does not happen under normal use. The voltage is clamped at an input when it exceeds the power supply (VDD) plus the input clamping diode drop.
[Image: input structure showing clamping diodes D1 and D2 between the pin and the supply rails]
In this image D1 and D2 are clamping diodes. Ordinarily they are benign and only add a small amount of capacitance to the input circuit. If the input voltage exceeds the supply rail or goes below Ground, one or the other diode will turn on. This clamps the voltage into the device at one diode drop above the Supply rail or below Ground. Keep in mind that D1 and D2 will heat up as more current flows through them. The specification limits that current to 20mA.
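To put rough numbers on that 20mA limit, here's a back-of-the-envelope sketch in Python. The 0.6V diode drop, the 12V fault voltage, and the 1kΩ series resistor are illustrative assumptions, not datasheet values:

```python
# Rough estimate of the current pushed into an input clamp diode when the
# input voltage exceeds the supply rail. All values here are illustrative.

VDD = 5.0            # supply rail (V)
VF = 0.6             # assumed clamp-diode forward drop (V)
I_CLAMP_MAX = 0.020  # absolute-maximum clamp current from the datasheet (A)

def clamp_current(v_in, r_series):
    """Current through the upper clamp diode for an over-voltage input."""
    overdrive = v_in - (VDD + VF)
    if overdrive <= 0:
        return 0.0   # diode is off: input is within the rails
    return overdrive / r_series

# Example: 12 V accidentally applied through a 1 kOhm series resistor.
i = clamp_current(12.0, 1000.0)
print(f"{i * 1000:.1f} mA,", "within" if i <= I_CLAMP_MAX else "EXCEEDS", "the 20 mA limit")
```

With those assumed numbers the series resistor keeps the fault current at 6.4mA, comfortably under the 20mA absolute maximum.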

I can quickly think of three situations where clamping diodes activate. The first is where the input accidentally shorts to a high voltage rail or some voltage source below ground. It's most likely to happen when debugging or the like.

The second is when ringing or overshoot occurs on the rising or falling edge of a changing digital signal. This is the case where your input resistors will be quite effective.

The last case is where a static discharge strikes the input and attempts to flow into it. In this case it's hard to predict the effect on the device, but input resistors may offer some protection.

One note about input resistors is that they will slow the rise and fall times of your signals. At low frequencies that shouldn't be an issue. As your input signal frequency rises, it may interfere with the ability of the input to register the change. The higher the value the slower the rise time.
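As a rough sketch of that speed penalty: the series resistor and the pin's input capacitance form a first-order RC low-pass, whose 10–90% rise time is about 2.2·R·C. The 10pF pin capacitance below is an assumed typical value, not a datasheet figure:

```python
# 10-90% rise time of a first-order RC low-pass: ~2.2 * R * C.
def rise_time_10_90(r_ohms, c_farads):
    return 2.2 * r_ohms * c_farads

C_PIN = 10e-12  # assumed pin + trace capacitance (~10 pF)
for r in (150, 1000, 4700):
    print(f"R = {r:>4} ohm -> rise time ~ {rise_time_10_90(r, C_PIN) * 1e9:.1f} ns")
```

Even 4.7kΩ only costs about 100ns here, which is why the resistors are harmless at low signal rates but start to matter as edges get fast.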
 

BobTPH

Joined Jun 5, 2013
8,809
I am using a PIC10F222 and it says it draws 1.1mA in the data sheet
Where in the datasheet did you get that figure?

The input leakage current for port pins is listed at ±1uA, 1000 times lower than what you quoted. This is the max that an input will source or sink when the voltage is within the normal range, i.e. within the power and ground voltages of the chip's supply.

The input clamp current is the max it can withstand when the input is outside the normal range.

Unless you think there will be voltages outside the normal range, a resistor is not needed to limit current.

Bob
 

MrChips

Joined Oct 2, 2009
30,708
There is no "one size fits all".

You need to specify the IC technology employed, IC datasheet, circuit application, voltages and frequency involved.
 

Ian0

Joined Aug 7, 2020
9,667
When should I use a resistor to limit current into an IC.
Depends on what's at the other end of the resistor!
If you're going to use it for a mains zero-crossing detector like in this application note:
http://ww1.microchip.com/downloads/...ero-Cross-Detector_ApplicationNote_AVR182.pdf
then don't leave it out.

If the resistor goes to some device that runs off the same power supply as the IC, then it is rarely necessary.
If it goes to a device running off a different power supply then it is a good idea: if one power supply is switched off, the other would try to supply it via the protection diode, and the same can happen even if the two supplies are at slightly different voltages.
 

BobaMosfet

Joined Jul 1, 2009
2,110
This is a simple and probably stupid question: when should I use a resistor to limit current into an IC? Most of the datasheets I have been reading don't give any suggestions or info. That is, except for one IC I was going to use that had an internal zener diode; that one said to have a "sufficiently high input impedance" lol

Anyways, back to the main thing: for example, I am using a PIC10F222 and the datasheet says it draws 1.1mA max, but in the absolute maximum section it says "input clamp current: ±20mA". It is that last part I don't get, because if it only draws 1.1mA, under what condition would it draw 20mA or more?

So when testing/prototyping I have just been putting a 150Ω resistor on all the inputs of my ICs (the other two are IRS2186s), and it has worked fine, but when I make the transition to a PCB, any space I could save would be useful.

thanks
@jlawley1969 No such thing as a stupid question. Glad you asked--

There is a difference between working to limits and understanding what you're doing. You have many responses here saying that essentially it does not matter. This is WRONG (or sloppy at best, IMHO). And here is why:

An IC can only handle so much wattage through its junctions. The more wattage you drive through any IC, the less it is able to perform ideally, because it gets hot. This is what derating is for, and many of the charts you see at the end of a datasheet. You also shorten the life of the IC (and any component) by running it hot. Furthermore, what if you decide to expand your design? If you use power judiciously from the beginning, your IC (for example an MCU) can handle more pins being used and do more work. Plus, your voltage regulator or PSU can support your design longer without you having to select another because you're using more current than you need. The more current you use in a discrete signal, the slower it is to change, and the harder it is for the component to change that signal; everything takes time. A perfect example is an instrument-grade op-amp: at a small gain of 5 it can switch voltage stably at 200kHz, but at a gain of 100 it can only do so stably at 1.8kHz.

You wanted a real reason. I've given you one. The first thing you look at on a datasheet is the Pd (power dissipation) total. You then look at junction temperature derating, and then at voltage and current and how they're used. Datasheets lie. They are written by vendors trying to make their part look better than a competitor's, and they frequently lie through implication or omission. To see the truth, you must learn how to read a datasheet and look at what is important, so that you can see whether the datasheet tells you or whether you must calculate it from what they provide. It does not matter that a datasheet says something can be run at 5V with a max current of 100mA if the Pd says total dissipation is 250mW. Voltage or current has to give, or additional thermal cooling (fan or heatsink) must be used to keep the Pd at or below 250mW.

Always conserve power in your designs and you will thank yourself for it later. Know what you're doing. A resistor's only job is to limit current. There are secondary and tertiary benefits in how that is used, but at the end of the day a resistor does one thing: it resists the flow of electrons and dissipates that energy as heat. It heats up so that other components don't have to as much.
 

BobaMosfet

Joined Jul 1, 2009
2,110
@jlawley1969 Since you asked a specific question about a specific chip, here:

[Image: PIC10F222 datasheet absolute maximum ratings table]

The total Pd is 800mW. That means (if you run it at 5V): I = P/V, or I = 0.800/5, so I = 160mA. The total current the package can handle MAX is 160mA at 5V. You want to run it cooler than that, so for safety, cut it in half: 80mA. That gives you a margin.
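The arithmetic above can be sketched in a couple of lines (the 800mW figure comes from the datasheet excerpt above; the 50% derating is a rule of thumb, not a datasheet requirement):

```python
# Back out the maximum total current from the package power rating,
# then apply the 50% safety margin suggested above.
PD_MAX = 0.800  # total package power dissipation (W)
VDD = 5.0       # operating voltage (V)

i_max = PD_MAX / VDD   # I = P / V -> absolute ceiling at 5 V
i_safe = i_max / 2     # halved for thermal margin
print(f"max: {i_max * 1000:.0f} mA, derated: {i_safe * 1000:.0f} mA")
```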

Ideally, at 5V, if you use 4K7 resistors on input or output pins, that limits you to ~1mA, which is adequate for noise immunity and is plenty for the IC to use for signaling purposes, both input and output. I recognize that space is limited; that is why you can get things like bussed/network resistors, which pack more resistors into a tiny space.

A perfect example is this part:

[Image: bussed resistor network package]

OEM: L101S472LF (TT Electronics)
Vendor: 15M7274 (Newark/Element14)

You can obtain these affordably through Mouser, Digi-Key, Arrow, Allied, etc. Pin 1 is the common pin (ground here); each of the other pins connects to pin 1 through a 4K7 resistor.
 

Ian0

Joined Aug 7, 2020
9,667
A perfect example is an instrument-grade op-amp: at a small gain of 5 it can switch voltage stably at 200kHz, but at a gain of 100 it can only do so stably at 1.8kHz.
I cannot fault the rest of your argument, but this bit is a little dubious.
Instrumentation amplifiers tend to get more stable as the gain is increased (less feedback), and the effect you describe seems to be rather more to do with slew-rate limiting than stability.

I'd also like to add to your comments:
Maximum voltage = 5V and maximum current = 100mA may both be true, but they might not both be true simultaneously.
If Pd is 250mW, then 5V @ 50mA would be OK, as would 2.5V @ 100mA.
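That trade-off is easy to sanity-check numerically; a minimal sketch, using the 250mW example figure from above:

```python
# Check whether an operating point (V, I) fits inside the package
# dissipation limit; max V and max I need not both hold at once.
PD_LIMIT = 0.250  # W, example figure from the post above

def within_pd(volts, amps):
    return volts * amps <= PD_LIMIT + 1e-12  # tolerance for float rounding

print(within_pd(5.0, 0.050))  # 250 mW -> True
print(within_pd(2.5, 0.100))  # 250 mW -> True
print(within_pd(5.0, 0.100))  # 500 mW -> False
```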

Allowing current through the protection diodes (even if limited with a resistor) does have a trap for the unwary. That current ends up in the power supply, in a way your voltage regulator has no control over. It may raise the power supply voltage to such an extent that it causes damage. If there is any likelihood of a long-term over-voltage on an input pin, it is better to attenuate it than to allow it through the protection diodes.
Over-voltages of short duration (spikes) generally do not add enough charge to the power supply to cause problems.
 

Papabravo

Joined Feb 24, 2006
21,159
It all comes down to the exposure to stressful events. If the majority of the nodes on a board have limited exposure to offboard sources there is seldom a need to add protection to those nodes. It is at the interfaces to a board that concern should be directed. Making this decision can be criticized in some quarters, but when it comes to driving down the cost of production that consideration usually wins.
 

Ian0

Joined Aug 7, 2020
9,667
The distinction is between the maximum current that is drawn by the inputs, with the inputs operating at their normal voltages (1uA) and the current that can be forced into the input pins by a voltage outside the supply rails (20mA).
 

BobaMosfet

Joined Jul 1, 2009
2,110
I cannot fault the rest of your argument, but this bit is a little dubious.
Instrumentation amplifiers tend to get more stable as the gain is increased (less feedback), and the effect you describe seems to be rather more to do with slew-rate limiting than stability.

I'd also like to add to your comments:
Maximum voltage = 5V and maximum current = 100mA may both be true, but they might not both be true simultaneously.
If Pd is 250mW, then 5V @ 50mA would be OK, as would 2.5V @ 100mA.

Allowing current through the protection diodes (even if limited with a resistor) does have a trap for the unwary. That current ends up in the power supply, in a way your voltage regulator has no control over. It may raise the power supply voltage to such an extent that it causes damage. If there is any likelihood of a long-term over-voltage on an input pin, it is better to attenuate it than to allow it through the protection diodes.
Over-voltages of short duration (spikes) generally do not add enough charge to the power supply to cause problems.
@Ian0 I think you missed my point about 5V and 100mA - I meant simultaneously. Many people who read datasheets think the published MAX V and MAX I mean they can achieve both simultaneously, and never look at Pd.

As for the amplifier issue, it was an older INA126. Here's the graph:
[Image: INA126 gain vs. frequency graph]
That graph is telling you that as gain increases, the ability to maintain a frequency falls; that is because it can't slam current up and down that fast. As I understand it, that technically is what slew rates are telling you: they describe the rate of change of voltage and current in relation to one another. Please educate me if I'm way off :)

I'm always interested in correcting my understanding of things even if I do so in a very grizzly resistive way :)
 

BobaMosfet

Joined Jul 1, 2009
2,110
The distinction is between the maximum current that is drawn by the inputs, with the inputs operating at their normal voltages (1uA) and the current that can be forced into the input pins by a voltage outside the supply rails (20mA).
@Ian0 - Um.... It doesn't work that way. Current is never 'forced' through something. It is drawn through it by the force of voltage, and whatever current is allowed to flow through that path will flow through it unless something prevents it.

If you don't see anything specifically stating the input current is limited to 1uA, then it will in fact be as much as the chip can manage (hence why they state the maximums). And as such, resistors are required if you wish to control that.

I hunted through the entire datasheet and could find no reference to 1uA for the inputs.
 



BobaMosfet

Joined Jul 1, 2009
2,110
This is the spec that states how much current is drawn by the input when the voltage is between Vss and Vdd:



Notice the voltage range of the spec you are looking at.


Bob
@BobTPH - That spec is for leakage current, and as I read it, it is saying that at a given voltage VPIN the additional leakage can be between 1uA and 1mA. That is parasitic current when the input pin is in a High-Z (high-impedance) state. That isn't input or output; that's just leakage current.

An input pin usually has a two-transistor input stage, often with over- and under-voltage protection. All these elements will have some leakage from the power rail (when the input is low) and from the negative rail (when the input is high). That is where the leakage current comes from. That is a separate issue from drive current for inputs and outputs.
 

BobTPH

Joined Jun 5, 2013
8,809
Don't know how what I posted got changed to mA, it is actually uA. Here is a screen clip from the datasheet:

[Image: datasheet input leakage current specification]

Notice the range of input voltages over which this applies. It applies whenever the voltage on the input pin is between the power supply rails to the chip.

Now look at the spec you are quoting:

[Image: datasheet input clamp current specification]

Notice that it applies ONLY when the voltage is less than 0 or higher than the supply voltage.

Input leakage current, though a strange way of saying it, is the spec used to determine how much current an input might source or sink.

My statements about not needing a resistor are qualified by the voltage staying in range.

You are also misinterpreting the absolute max specification. 20mA is not the maximum current the pin might draw at any input voltage. If you put 100V on the pin, it will, albeit briefly, draw more than 20mA. 20mA is the maximum current you can allow to flow into a pin without damaging it.

Reading datasheets is a skill you have to acquire. They are not obvious.

Bob
 

