Why do you need a series resistor?


What is the correct series resistance to use when driving an LED?

These are frequently asked questions.

One possible answer is: You do not need a series resistor.

The proper answer is: A series resistor is required to limit the current through the LED and prevent the LED from being destroyed.

In this tutorial you will learn in what circumstances a series resistor is not necessary. You will also learn when you should use a series resistor and how to calculate the resistance required.

So let's get started.

An LED (Light Emitting Diode) is a P-N junction semiconductor device. As with all P-N diodes, the LED is a non-linear device and does not obey **Ohm's Law**.

An ideal resistor is a linear device and it obeys Ohm's Law. Ohm's Law states that the current (I) through the resistor is directly proportional to the applied voltage (V) and inversely proportional to the resistance (R). Stated mathematically: I = V / R
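Ohm's Law is easy to check with a few lines of Python; the resistance values below are arbitrary, chosen only for illustration:

```python
def resistor_current(voltage, resistance):
    """Return the current (in amperes) through an ideal resistor: I = V / R."""
    return voltage / resistance

# The same 2 V applied across three different (arbitrary) resistances:
for r in (10, 100, 200):                 # ohms
    i = resistor_current(2.0, r)         # amperes
    print(f"{r:>4} ohm -> {i * 1000:.0f} mA")
```

Doubling the resistance halves the current, which is exactly the straight-line behavior shown in the graph below.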

The I-V graph of various resistances might look like this:

Note that the slope of the line, dI/dV, is the reciprocal of the resistance. In other words, a small resistance gives a steep slope while a larger resistance gives a shallower one. Resistors are linear devices and hence the I-V curve for a resistor is a straight line.

In contrast, here is the I-V curve of a typical red LED.

LEDs and semiconductor P-N junction devices (diodes) are non-linear devices. They do not obey Ohm's Law. In particular, the I-V curve follows the **Shockley diode equation**, which mathematically describes the diode current as rising exponentially with increasing forward voltage.

**Aside:** Note the difference between static (DC) resistance and dynamic (AC) resistance.

The **static resistance** of this test LED at 20mA is approximately 2V / 20mA = 100Ω.

The **dynamic resistance** at the 2V, 20mA operating point is approximately 1V / 100mA = 10Ω, i.e. the ΔI/ΔV slope is much steeper than I/V.

It is the low dynamic resistance that gives us grief and presents us with the risk of blowing the LED.
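The two figures in the aside are easy to verify; the ΔV and ΔI values below are the small-signal figures quoted above:

```python
# Static (DC) resistance: the simple ratio V / I at the operating point.
# Dynamic (AC) resistance: the local slope dV/dI, estimated from the
# small-signal deltas quoted in the aside (1 V change -> 100 mA change).
v_f, i_f = 2.0, 0.020          # test conditions: 2 V at 20 mA
delta_v, delta_i = 1.0, 0.100  # small-signal figures from the aside

r_static = v_f / i_f           # 100 ohms
r_dynamic = delta_v / delta_i  # 10 ohms

print(f"static resistance:  {r_static:.0f} ohm")
print(f"dynamic resistance: {r_dynamic:.0f} ohm")
```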

The dynamic resistance of the LED with a supply voltage less than 1V is very high, i.e. the current through the LED is very low. When the supply voltage rises above the turn-on voltage of the LED, the dynamic resistance falls rapidly and the current rises exponentially with voltage. A tiny increase in supply voltage will cause the LED to conduct a very large current, at risk of destroying the LED. For this reason, we need to control the LED current, not the voltage. We need a **constant current source**.

**On with the show**

Now that we have covered the fundamentals, let us tackle the question of what series resistance is required when driving an LED.

Let us assume for this discussion that we have a red LED with given test conditions of forward current IF of 20mA and forward voltage VF of 2V.

We can drive this LED with a constant voltage source of 2V and no series resistor. However, there are detrimental consequences of doing so. Let us outline the problems with this.

- The LED VF may not be exactly 2V. There is variability in actual LED parameters from device to device.
- No voltage source will be exactly 2V.
- Both of the above vary with temperature.

Since the 2V, 20mA point on the LED I-V curve is on a sharp slope (dynamic resistance is low) a tiny increase in the supply voltage will result in a very large increase in LED current.
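This sensitivity can be illustrated numerically with the Shockley diode equation. The saturation current and ideality factor below are made-up illustrative values, not measured parameters of any particular LED:

```python
import math

# Shockley diode equation: I = Is * (exp(V / (n * Vt)) - 1)
I_S = 1e-18     # saturation current (A) -- assumed, for illustration only
N = 2.0         # ideality factor -- assumed
V_T = 0.02585   # thermal voltage kT/q at roughly room temperature (V)

def diode_current(v):
    """Diode current (A) at forward voltage v, per the Shockley equation."""
    return I_S * (math.exp(v / (N * V_T)) - 1.0)

i1 = diode_current(2.00)
i2 = diode_current(2.10)   # only 100 mV more
print(f"current ratio for +0.1 V: {i2 / i1:.1f}x")
```

Even with these rough model values, a 100mV increase multiplies the current severalfold, which is why a bare constant-voltage drive is so risky.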

In order to prevent this from happening, it is recommended that a series resistor be added to the drive circuitry. How much resistance do we need?

In electronic circuit analysis, adding a resistor in series gives us what is known as a **load line**. Graphically, the load line allows us to solve the two simultaneous equations (the current and voltage through the resistor and the current and voltage through the LED). The intersection of the LED I-V curve and the resistor load line gives us the operating point of the LED. For any LED placed in the circuit, the solution must lie on the load line.

The 10Ω resistor (green line in graph above) does not do much to stabilize the current. The load line is too steep.
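As a sketch of the same graphical method, the operating point can be found numerically by bisecting on the LED voltage until the load-line current equals the diode current. The diode model parameters here are illustrative assumptions, not values from a real datasheet:

```python
import math

# Illustrative Shockley model (assumed parameters, not a real LED datasheet).
I_S, N, V_T = 1e-18, 2.0, 0.02585

def diode_current(v):
    return I_S * (math.exp(v / (N * V_T)) - 1.0)

def operating_point(v_supply, r_series):
    """Find where the load line I = (Vs - V) / R crosses the diode curve."""
    lo, hi = 0.0, v_supply
    for _ in range(100):
        mid = (lo + hi) / 2
        # If the load line still carries more current than the diode draws,
        # the LED voltage can rise further.
        if (v_supply - mid) / r_series > diode_current(mid):
            lo = mid
        else:
            hi = mid
    v = (lo + hi) / 2
    return v, (v_supply - v) / r_series

v_led, i_led = operating_point(6.0, 200.0)
print(f"LED voltage ~ {v_led:.2f} V, current ~ {i_led * 1000:.1f} mA")
```

With a 6V supply and a 200Ω resistor, this model settles near the 2V, 20mA region, matching where the blue load line crosses the LED curve in the graph.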

The 100Ω series resistor (cyan line) and the 200Ω series resistor (blue line) offer better solutions. The trade-off with adding a series resistor is that we now have to increase the supply voltage. With a 100Ω series resistor, the supply voltage needs to be about 4V; with the 200Ω series resistor, about 6V. That is, we need to provide some substantial voltage overhead. (See where the load line touches the voltage axis.)

Looking at it from a different perspective, if the supply voltage is 6V then the required series resistance is

(6V – 2V) / 20mA = 200Ω
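This calculation can be wrapped in a small helper; the function name is my own, for illustration:

```python
def series_resistor(v_supply, v_forward, i_forward):
    """Series resistance (ohms) that drops the excess supply voltage
    at the desired LED current: R = (Vs - Vf) / If."""
    if v_supply <= v_forward:
        raise ValueError("supply voltage must exceed the LED forward voltage")
    return (v_supply - v_forward) / i_forward

# The worked example from the text: 6 V supply, 2 V / 20 mA red LED.
print(series_resistor(6.0, 2.0, 0.020))   # prints 200.0
```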

Ideally, we want the resistor load line to intersect the LED I-V curve at a right angle, i.e. a nearly horizontal load line. In other words, we want as large a resistance as possible. Putting it another way, we need to increase the supply voltage to as large a value as we can accommodate. The more voltage overhead we provide, the better we are able to stabilize the current.

The takeaway here is: the larger you can make the series resistance, the closer you get to creating a **constant current source**, which is what we were trying to create in the first place.

If your LED driver is already a constant current source, then you do not need a series resistor. An ideal constant current source has infinite internal resistance.