# Might be a stupid question to most...

Discussion in 'General Electronics Chat' started by kuera, Dec 29, 2014.

1. ### kuera Thread Starter New Member

Aug 17, 2012
23
0
but I was wondering.
I know how to calculate the series resistor needed to limit the current through an LED when the source voltage is higher than the LED's forward voltage: (supply voltage − LED voltage) divided by the required current. But my voltage source is the same as what the LED needs. The thing is, it's a 1 watt LED (at 3.6 volts), and instead of letting it draw the full 270 mA I only want it to draw about 150 mA. How would I go about working that out, or what resistor would I need for it?
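The formula described above can be sketched in a few lines, assuming the usual series-resistor rule R = (Vsupply − Vled) / I. It also shows why it breaks down in this case: with the supply equal to the LED's forward voltage, the formula gives zero headroom.

```python
def series_resistor(v_supply, v_led, i_led):
    """Standard series-resistor formula: R = (Vsupply - Vled) / I."""
    return (v_supply - v_led) / i_led

# 5 V supply, 3.6 V LED at 150 mA -> (5 - 3.6)/0.15, about 9.3 ohm
print(series_resistor(5.0, 3.6, 0.150))

# Supply equal to the LED's forward voltage -> 0 ohm:
# the formula gives no voltage to drop, which is the problem here.
print(series_resistor(3.6, 3.6, 0.150))
```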

2. ### blocco a spirale AAC Fanatic!

Jun 18, 2008
1,541
408
Are you saying you have a 3.6V supply and your LED has a 3.6V forward voltage?

3. ### MrChips Moderator

Oct 2, 2009
14,263
4,178
It is not possible to calculate the required resistance without knowing the I-V characteristic of the LED, which is non-linear.

Do this experimentally by trying different values of resistances until you get the desired effect.
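The experimental sweep suggested above can also be sketched numerically with a simple piecewise-linear LED model, Vf ≈ V0 + I·r_d. The values V0 = 3.2 V and r_d = 1.5 Ω below are made-up illustration numbers (chosen so that no resistor gives roughly the 270 mA mentioned in the thread); real values would come from the datasheet I-V curve or from measurement.

```python
# Hypothetical piecewise-linear LED model: Vf = V0 + I * r_d.
# V0 and R_D are illustration values only, not datasheet numbers.
V0 = 3.2       # assumed "knee" voltage, volts
R_D = 1.5      # assumed dynamic resistance, ohms
V_SUPPLY = 3.6

def led_current(r_series):
    """Solve V_SUPPLY = V0 + I * (R_D + r_series) for I (amps)."""
    return (V_SUPPLY - V0) / (R_D + r_series)

# Sweep a few common resistor values (0 ohm = no resistor)
for r in (0.0, 1.0, 2.2, 4.7):
    print(f"{r:4.1f} ohm -> {led_current(r) * 1000:5.1f} mA")
```

With these assumed model values, no resistor gives about 267 mA and roughly 1 Ω lands near the 150 mA target, but the point stands: the right value depends on the real curve, so trying standard values on the bench is the practical route.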

4. ### kuera Thread Starter New Member

Aug 17, 2012
23
0
It's a 3.6 volt, 1 watt LED (about 0.270 A),
so yeah, I see what you're saying. I'll just mess around ^^ Thanks.

5. ### kuera Thread Starter New Member

Aug 17, 2012
23
0
yep

6. ### blocco a spirale AAC Fanatic!

Jun 18, 2008
1,541
408
Then you will need to step up the voltage. Google "Joule thief" and you will find many such circuits.

7. ### wmodavis Well-Known Member

Oct 23, 2010
739
150
To reduce the current through the LED you need to reduce the voltage applied to it. A series resistor will do that, as will any of a myriad of other approaches.