Transmitter using IR LEDs

Thread Starter

sharkfire

Joined Feb 13, 2008
23
Is it advisable to use IR LEDs as a transmitter? The receiver we are using is a photodiode.

What resistor value should we use to get a range of 1 foot between the IR LED and the photodiode? We tried 330 ohms, but the range is inconsistent: sometimes it falls short, sometimes it reaches more than 1 foot.

Thanks in advance..
 

Salgat

Joined Dec 23, 2006
218
You can purchase specialized ultra-bright IR LEDs and photoreceivers. For example, the ones we recently bought respond to a 33 kHz carrier, which makes them largely immune to interference from fluorescent lighting and other noise sources.
 

scubasteve_911

Joined Dec 27, 2007
1,203
The distance is a matter of IR emitter power along with photodiode sensitivity to the particular wavelength. Divergence of the emitter has a lot to do with it too... The resistor values set the current, which is proportional to radiated IR. These are the main relationships you should understand to design a simple system.
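
As a rough back-of-the-envelope sketch of how those quantities tie together, here is a small Python link-budget estimate. Every number in it (radiant intensity, detector area, responsivity, distance) is an assumed placeholder for illustration, not data for any particular part:

# Rough link-budget sketch: emitter intensity + distance -> photocurrent.
# All component values below are assumptions for illustration only.
i_radiant    = 0.020   # assumed on-axis radiant intensity of the IR LED, W/sr
distance     = 0.30    # roughly 1 foot, in metres
area_pd      = 7e-6    # assumed photodiode active area, m^2
responsivity = 0.6     # assumed photodiode responsivity at 940 nm, A/W

irradiance = i_radiant / distance**2     # inverse-square falloff, W/m^2
p_received = irradiance * area_pd        # optical power landing on the diode, W
i_photo    = p_received * responsivity   # resulting photocurrent, A

print(f"Estimated photocurrent at {distance} m: {i_photo * 1e6:.2f} uA")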

As you increase speeds, photodiode capacitance becomes an issue. Switching photodiode type, biasing it, and having a good transimpedance circuit become important factors.
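
For a sense of scale, here is a quick estimate (again, all values are assumed, not taken from any datasheet) of what the load or feedback resistance and the diode capacitance do to signal level and bandwidth:

# Rough single-pole estimate of signal level and bandwidth for a photodiode
# feeding a resistance R. Values are assumptions for illustration only.
import math

i_photo = 1e-6     # assumed photocurrent, A (see the estimate above)
r_load  = 100e3    # assumed load / transimpedance resistance, ohms
c_diode = 10e-12   # assumed photodiode junction capacitance, F

v_signal = i_photo * r_load                      # signal voltage developed, V
f_3db    = 1 / (2 * math.pi * r_load * c_diode)  # rough bandwidth limit, Hz

print(f"Signal: {v_signal * 1e3:.1f} mV, bandwidth: {f_3db / 1e3:.0f} kHz")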

Steve
 

SgtWookie

Joined Jul 17, 2007
22,230
Most television remote controls use IR LEDs nowadays. They generally transmit their data on a 38kHz to 40kHz carrier, and can have a range exceeding 25 feet.

It's quite common (and cheap) to simply use resistors to limit the current through LEDs. However, implementing a constant-current circuit is much better, as you are then reasonably independent of your supply voltage.

LEDs may be supplied with very limited information, and sometimes no information at all.
For example, Radio Shack sells what they claim to be a "High-Output Infrared LED" as Catalog #: 276-143 for $1.99, but the only specifications given are: 5mm, 1.2VDC, 29mA, 940nm. It does not state whether these are typical values or maximum limits for voltage and current.

If you wanted to use a simple fixed resistor from a regulated supply (for this example, 5V), you could start by subtracting the stated LED operating voltage from the supply voltage:
5V - 1.2V = 3.8V
Then calculate a safe resistance (let's go for 10mA current for the moment):
R = E / I
R = 3.8V / 10mA
R = 3.8 / 0.01
R = 380 Ohms
So, use a 380 Ohm resistor in series with the LED across the 5V supply, and measure the actual voltage drop across the LED.
Let's say you measure 1.1V. Let's re-calculate for our final resistor value.

We don't know if the 29mA is the maximum or the recommended value. Let's assume it's the maximum and use a 10% safety margin so we don't risk burning the LED up in its first hour of use: 29mA * 90% = 26.1mA

5V - 1.1V = 3.9V (voltage to drop across resistor)
R = 3.9V / 26.1mA (current to limit through resistor)
R = 149.4 Ohms
150 Ohms is the next higher standard value.
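
Here is the same arithmetic in a few lines of Python, in case you want to rerun it with your own measured LED voltage (the E24 list is just the standard 5% resistor decade):

# The series-resistor arithmetic above, using the values from this post.
E24 = [100, 110, 120, 130, 150, 160, 180, 200, 220, 240, 270, 300,
       330, 360, 390, 430, 470, 510, 560, 620, 680, 750, 820, 910]

def led_resistor(v_supply, v_led, i_led):
    """Ideal series resistance, in ohms."""
    return (v_supply - v_led) / i_led

r_start = led_resistor(5.0, 1.2, 0.010)         # first pass: 380 ohms at 10 mA
r_final = led_resistor(5.0, 1.1, 0.029 * 0.90)  # measured 1.1 V, 29 mA derated 10%
r_std   = min(r for r in E24 if r >= r_final)   # next higher standard value: 150 ohms

print(r_start, round(r_final, 1), r_std)        # 380.0 149.4 150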

But if your circuit is battery powered, you'd be far better off with a current-limiting circuit. These are pretty easy to make using a standard red LED, a couple of resistors, and a transistor. They can also be made using a TL431, which is a shunt regulator, also known as a "variable Zener".
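
To put rough numbers on the red-LED version of that current limiter: the red LED holds the transistor's base at roughly its forward voltage, and the emitter resistor then sets the current. A quick sketch, where the forward drop, base-emitter voltage, and resistor value are assumed typical figures rather than data for any particular parts:

# Back-of-envelope numbers for a red-LED + NPN constant-current sink.
# All values are assumed, typical figures for illustration only.
v_ref     = 1.8    # assumed forward drop of the red reference LED, V
v_be      = 0.65   # typical base-emitter drop of a small NPN, V
r_emitter = 47.0   # assumed emitter resistor, ohms

i_led = (v_ref - v_be) / r_emitter                  # current forced through the IR LED, A
print(f"Constant current: {i_led * 1e3:.1f} mA")    # ~24.5 mA, nearly independent of supply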
 