Simple LED math

Discussion in 'General Electronics Chat' started by Doktor Jones, Sep 2, 2016.

  1. Doktor Jones

    Thread Starter Active Member

    Oct 5, 2011
    57
    1
    I'm trying to make a simple, compact USB-powered "flashlight" (before anyone says "but you can buy them cheaper on eBay", my reason for doing this is "because I can", and/or "I'm a masochist and enjoy designing and implementing circuits in Eagle CAD").

    To keep things simple, compact, and cheap, my grand idea is to use the PCB itself as the USB plug, with just an LED and a series resistor.

    The LED is rated at Vf(typ) = 3.7V, Vf(max) = 4.0V, If(typ) = 350mA, and If(max) = 500mA (datasheet here).

    The resistor I've chosen is a 4.7Ω, 3/4W part (datasheet here; specific part # CRCW12104R70JNEAHP).

    Assuming USB can output up to 5.2V (some supplies do this to counter voltage drop across thin/long USB cables), this is the math I've done:

    ========================================

    Desired voltage drop: 5.2V - 3.7V = 1.5V

    To drop that voltage at 350mA, I need a (1.5V / 0.35A ≈) 4.3Ω resistor.

    Assuming the resistor is indeed dropping 1.5V @ 0.35A, it's dissipating (1.5V × 0.35A ≈) 0.53W of power.

    The closest standard value above that which is cheap and plentiful is 4.7Ω, which is what I found.

    ========================================
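    The sizing math above can be sketched in a few lines of Python, using the fixed-Vf approximation from the post (i.e. treating the LED as dropping exactly its typical Vf regardless of current):

    ```python
    # Resistor sizing for the USB "flashlight" LED, using the values from the post.
    # Assumes the simple fixed-Vf LED model: the diode drops Vf(typ) at any current.
    v_supply = 5.2    # worst-case USB rail (V)
    v_led = 3.7       # LED Vf(typ) (V)
    i_target = 0.350  # LED If(typ) (A)

    v_resistor = v_supply - v_led       # voltage the resistor must drop
    r_ideal = v_resistor / i_target     # ideal series resistance
    p_resistor = v_resistor * i_target  # power dissipated in the resistor

    print(f"{r_ideal:.2f} ohm, {p_resistor:.3f} W")  # 4.29 ohm, 0.525 W
    ```

    which matches the ≈4.3Ω and ≈0.53W worked out above.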

    Here's where my math fails. Since I'm using a higher resistor value, how do I calculate how much current the LED will actually draw?

    Is it 1.5V (resistor drop) / 4.7Ω ≈ 319mA?
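    (A quick numeric check of that question, still under the fixed-Vf approximation; in reality Vf shifts a little with current, so the actual figure will differ slightly:)

    ```python
    # Current through the LED with the chosen 4.7-ohm resistor, fixed-Vf model.
    v_supply = 5.2
    v_led = 3.7
    r = 4.7

    i_led = (v_supply - v_led) / r           # ≈ 0.319 A, i.e. ~319 mA
    p_resistor = (v_supply - v_led) * i_led  # ≈ 0.479 W, under the 3/4 W rating
    ```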
     
  2. AlbertHall

    Well-Known Member

    Jun 4, 2014
    1,931
    382
    Yep (given the voltages you say).
     