Might be a stupid question to most...

Discussion in 'General Electronics Chat' started by kuera, Dec 29, 2014.

  1. kuera

    Thread Starter New Member

    but I was wondering.
    I know how to work out the resistor needed to limit the current through an LED when the source voltage is higher than the LED's forward voltage: (supply voltage - LED voltage) divided by the required current. But my supply voltage is the same as what the LED needs. The thing is, it's a 1 watt LED (at 3.6 volts), and instead of letting it draw the full 270 mA I only want it to draw about 150 mA. How would I go about working that out, or what resistor would I need for it?
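    (For reference, here is the standard calculation described above as a quick sketch in Python. The 5 V supply is purely an illustrative assumption to show how the formula is normally applied; with a 3.6 V supply and a 3.6 V forward voltage there is no headroom left for a resistor sized this way.)

```python
# Standard series-resistor formula: R = (Vs - Vf) / I.
# The 5 V supply below is an ASSUMED example value, not the 3.6 V supply
# from the question -- it only shows the usual calculation.
V_SUPPLY = 5.0    # hypothetical supply voltage, V
V_LED    = 3.6    # LED forward voltage, V
I_LED    = 0.15   # desired LED current, A

r = (V_SUPPLY - V_LED) / I_LED          # resistance needed to set the current
p = (V_SUPPLY - V_LED) * I_LED          # power dissipated in the resistor
print(f"R = {r:.1f} ohm, dissipating {p * 1000:.0f} mW")   # ~9.3 ohm, ~210 mW
```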

    Thanks in advance.
     
  2. blocco a spirale

    AAC Fanatic!

    Are you saying you have a 3.6V supply and your LED has a 3.6V forward voltage?
     
  3. MrChips

    Moderator

    It is not possible to calculate the required resistance without knowing the I-V characteristic of the LED, which is non-linear.

    Do this experimentally by trying different resistance values until you get the desired effect.
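    (As an illustration of why the I-V curve matters, here is a rough sketch in Python. The three current/voltage points are ASSUMED placeholder values, not data for any real 1 W LED -- read the real points off your LED's datasheet curve, or measure them as suggested above.)

```python
# Estimate the series resistor for a target current, given a few points from
# the LED's forward I-V curve. The points below are ASSUMED values for
# illustration only -- substitute datasheet or measured data.
iv_points = [(0.05, 3.00), (0.15, 3.40), (0.27, 3.60)]   # (current A, Vf V)

V_SUPPLY = 3.6     # supply voltage, V
I_TARGET = 0.15    # desired LED current, A

def forward_voltage(i, points):
    """Linearly interpolate the LED forward voltage at current i."""
    for (ia, va), (ib, vb) in zip(points, points[1:]):
        if ia <= i <= ib:
            return va + (vb - va) * (i - ia) / (ib - ia)
    raise ValueError("current outside the tabulated range")

vf = forward_voltage(I_TARGET, iv_points)    # ~3.4 V at 150 mA (assumed)
r = (V_SUPPLY - vf) / I_TARGET               # resistor must drop the difference
p = (V_SUPPLY - vf) * I_TARGET               # power dissipated in the resistor
print(f"Vf at {I_TARGET * 1000:.0f} mA: {vf:.2f} V")
print(f"Series resistor: about {r:.2f} ohm, dissipating {p * 1000:.0f} mW")
```

    With these made-up numbers the resistor comes out around 1.3 ohm, but a different curve would give a noticeably different value, which is exactly why measuring is the reliable route.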

    Let us know your results.
     
  4. kuera

    Thread Starter New Member

    It's a 3.6 volt, 1 watt LED (about 0.27 A),
    so yeah, I see what you're saying. I'll just mess around ^^ Thanks
     
  5. kuera

    Thread Starter New Member

    yep
     
  6. blocco a spirale

    AAC Fanatic!

    Then you will need to step up the voltage. Google "Joule thief" and you will find many such circuits.
     
  7. wmodavis

    Well-Known Member

    To reduce the current through the LED you need to reduce the voltage applied to it. A resistor would do that, as would any of a number of other approaches.
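    (To illustrate how a series resistor pulls the LED voltage and current down, here is a sketch in Python that finds the operating point where the resistor's load line meets the LED curve. It reuses the same ASSUMED I-V points as the sketch earlier in the thread; real values would come from the datasheet or measurement.)

```python
# For a given series resistor R, find the LED voltage v at which the current
# through the resistor, (V_SUPPLY - v) / R, equals the LED current at v.
# The I-V points are the same ASSUMED placeholders used above.
iv_points = [(0.05, 3.00), (0.15, 3.40), (0.27, 3.60)]   # (current A, Vf V)

V_SUPPLY = 3.6   # supply voltage, V

def led_current(v):
    """Assumed LED current at forward voltage v (linear interpolation)."""
    if v < iv_points[0][1]:
        return 0.0                       # below the first point: treat as off
    if v > iv_points[-1][1]:
        return iv_points[-1][0]          # above the last point: clamp
    for (ia, va), (ib, vb) in zip(iv_points, iv_points[1:]):
        if va <= v <= vb:
            return ia + (ib - ia) * (v - va) / (vb - va)

def operating_point(r):
    """Bisect on LED voltage until the resistor and LED currents agree."""
    lo, hi = 0.0, V_SUPPLY
    for _ in range(60):
        v = (lo + hi) / 2
        if (V_SUPPLY - v) / r > led_current(v):
            lo = v   # resistor can supply more current: LED voltage is higher
        else:
            hi = v
    return v, led_current(v)

for r in (0.5, 1.3, 3.3):
    v, i = operating_point(r)
    print(f"R = {r:3.1f} ohm -> LED at about {v:.2f} V, {i * 1000:5.1f} mA")
```

    With these assumed points, roughly 1.3 ohm lands near the 150 mA target, while larger values pull the current down further.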
     