Very simple project

Discussion in 'The Projects Forum' started by lnz423, Feb 21, 2013.

1. lnz423 Thread Starter New Member

Feb 21, 2013
5
0
Hey everyone. I'm new to the forum and have a little project that I'm trying to complete.

I'm trying to add an LED to the circuit of a fan so that I can have an indication of when the fan is running. [EDIT: There is a switch between the fan and the power source that controls the fan. That's why I want the LED. I just didn't draw it in the circuit.] I just don't want to install the LED and break something in the process.

I measured the resistance across the fan and it was 0.3 ohms but that doesn't seem right to me. Can you measure resistance on a fan motor like that?

Ok well I drew up the circuit on some free online software and here is what I came up with.

The "M" element is the fan with the .3 ohm resistance. The battery will be at 14 V constant.

I looked up an LED at RadioShack and the specs were max 20mA and 3.0 V so the resistor in the circuit shown would have to be a minimum of 550 ohms right?

I have an LED and I don't know the specs. I tried measuring the resistance but I soon researched and found out that you can't measure LEDs like that.

I think all I would need to get a good visible light from the LED I have now would be about 2V. So could I just add a 900 ohm resistor in series with this LED that I have (which should give my LED 2V)? Would that be OK?

2. #12 Expert

Nov 30, 2010
16,655
7,293
You must stop thinking about LEDs as voltage operated. They aren't.
What you do is figure the current you want and the LED decides the voltage it needs. It's pretty predictable.

14 volts available minus 3 for the LED and you figure 11V/.01 amp.
or 11V/.02 amps (maximum).

Your resistor will be in the range of 1100 to 550 ohms.
P=I*I*R
.01 squared times 1100 = .11 watts
.02 squared times 550 ohms = .22 watts

Running it as hard as it can go (20 mA) will require a 1/2 watt resistor (because we always double for a safety factor),
and half bright (10 mA) will only require a 1/4 watt resistor.

Now that you know the math, you can choose a resistor by yourself.
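The arithmetic above can be sketched in a few lines (Python here purely for illustration; the 14 V supply and 3 V LED drop are the values from this thread, and the function names are my own):

```python
# Assumed values from the thread: 14 V supply, ~3 V LED forward drop.
SUPPLY_V = 14.0
LED_VF = 3.0

def series_resistor(current_a):
    """Resistance needed to limit the LED to the given current (ohms)."""
    return (SUPPLY_V - LED_VF) / current_a

def resistor_power(current_a):
    """Power dissipated in that resistor: P = I^2 * R (watts)."""
    return current_a ** 2 * series_resistor(current_a)

for i in (0.01, 0.02):
    print(f"{i * 1000:.0f} mA -> {series_resistor(i):.0f} ohms, "
          f"{resistor_power(i):.2f} W in the resistor")
```

Running this reproduces the 1100 ohm / 0.11 W and 550 ohm / 0.22 W figures above; double the wattage for the safety factor and pick the nearest stock part.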

Last edited: Feb 21, 2013
lnz423 likes this.
3. thatoneguy AAC Fanatic!

Feb 19, 2009
6,357
718
LED rated 3V @ 20mA

LEDs are current devices, not voltage, so the resistance function won't work.

They do show different voltages with different currents, but it isn't linear.

All you need to do is limit the current through the LED.

If the supply is 14V, the resistor is calculated with Ohms Law V=IR where V is volts, I is current, and R is resistance.

You know your Vf of the diode is 3V @ 20mA, so we calculate what value resistor is needed to only allow 20mA through the LED when 14V is applied:

$
\mathsf{R=\frac{V}{I}
=\frac{14\,\mathrm{V}_{supply}-3\,\mathrm{V}_{LED}}{20\,\mathrm{mA}}
=\frac{11\,\mathrm{V}_{drop}}{20\,\mathrm{mA}}
=550\,\Omega}
$

Next highest Standard Value is a 560Ω Resistor, so that should work fine.

Put resistor in series with LED, and then both of those across the fan supply, as sort of shown in your diagram.
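That "next highest standard value" step can be sketched too, assuming the standard E12 resistor series (the `next_standard` helper below is my own illustration, not a library function):

```python
import math

# E12 series mantissas (standard preferred values per decade).
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def next_standard(r_ohms):
    """Smallest E12 resistor value at or above r_ohms."""
    decade = 10.0 ** math.floor(math.log10(r_ohms))
    for mult in E12 + [10.0]:  # 10.0 spills into the next decade
        value = mult * decade
        if value >= r_ohms:
            return value

print(next_standard(550))  # the 560 ohm part mentioned above
```

Rounding up rather than down errs on the side of slightly less LED current, which is the safe direction.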

lnz423 likes this.
4. lnz423 Thread Starter New Member

Feb 21, 2013
5
0
Interesting. I've never really worked with LEDs before and I didn't know. Thanks for the advice. I guess I can just choose the resistor that will supply less than the max current.

Yeah that's what I was thinking. I just didn't know about the voltage issue. Thanks.

What do I do about the LED that I have that I know nothing about? Is there anything I can do to test what current I should give it? Or should I just buy a new LED?

5. #12 Expert

Nov 30, 2010
16,655
7,293
Try it in the same circuit you're building today. I've never seen one that wouldn't survive .02 amps.

Possible problem: they don't like being connected backwards with anything above 5 volts; they break. Maybe use three AA batteries for 4.5 volts, so it won't break if you accidentally try it backwards. That leaves 1.5 volts for the resistor, and 1.5 V / 10 mA = 150 ohms in series.
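The battery-test arithmetic, sketched in Python (this assumes the ~3 V forward drop discussed earlier; with an unknown LED the actual drop may differ a bit, which only shifts the current slightly):

```python
# Safe-test setup: 3 AA cells so a reversed LED stays under 5 V and survives.
CELLS = 3
CELL_V = 1.5       # volts per AA cell
LED_VF = 3.0       # assumed forward drop, as discussed above
TARGET_I = 0.010   # amps (10 mA, a gentle test current)

supply = CELLS * CELL_V       # total battery voltage
leftover = supply - LED_VF    # voltage left across the resistor
r = leftover / TARGET_I       # series resistance needed

print(f"{supply} V supply, {leftover} V leftover -> {r:.0f} ohm series resistor")
```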

6. lnz423 Thread Starter New Member

Feb 21, 2013
5
0
Ok that sounds good. I'll try that.

How do I know if it's backwards? I know that when I test it with my multimeter, the multimeter will supply enough current to light it when hooked up one way, but not the other. So is the right polarity the one where it lights up? (I'm assuming my multimeter's positive lead supplies the current.)

7. MrChips Moderator

Oct 2, 2009
12,622
3,451
It depends on your multimeter and whether it is analog or digital. On an analog multimeter, current is supplied from the negative terminal.

Test your multimeter with an LED of known polarity.

8. lnz423 Thread Starter New Member

Feb 21, 2013
5
0
Ok I found this picture about the polarity. https://encrypted-tbn2.gstatic.com/...t11G0Gf8uQabf55YUQap0zEWnPYbIaTZp78fALVcSpDSd

I can see the inside of the LED so I think I can get the right polarity.

Should I just buy a resistor over 550 ohms and wire it in? I figure it should be able to handle 20mA, right? Maybe I'll wire in a 1k resistor first to see if it gives me enough light, and if not I'll change it to a lower resistance.

9. tracecom AAC Fanatic!

Apr 16, 2010
3,878
1,396
Many LEDs look almost as bright at 10 mA as they do at 20 mA. I would try a 1k and see if it's bright enough.

Apr 5, 2008
15,796
2,382
tracecom likes this.
11. tracecom AAC Fanatic!

Apr 16, 2010
3,878
1,396
Now, that I can remember.

12. lnz423 Thread Starter New Member

Feb 21, 2013
5
0
Awesome. I happened to have a 1k resistor and an LED, so I won't need to buy anything. I measured the resistor and it actually reads 981 ohms, so that should be about perfect.

Thanks! That's a pretty good way to remember it.

I love this forum. Everyone is so responsive and helpful. I appreciate it.