LED - Wattage Calculation

DickCappels

The simple answer is that it's the current through the LED times the voltage drop across the LED. For the case of maximum average current, that would be 30 milliamps at nominally 3.6 volts = 108 milliwatts. The actual voltage depends on the forward voltage rank of your particular diode, found by matching the part number of your diode against the codes in the table of electro-optical characteristics on the second page of the data sheet.
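To make the arithmetic concrete, here is a minimal sketch of that power calculation. The 30 mA and 3.6 V figures come from the data sheet discussed here; substitute the values for your own part's forward voltage rank.

```python
# LED power dissipation: P = If * Vf
i_f = 0.030  # A, maximum average forward current from the data sheet
v_f = 3.6    # V, nominal forward voltage (check your diode's Vf rank)

p_led = i_f * v_f
print(f"LED dissipation: {p_led * 1000:.0f} mW")  # 108 mW
```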

The data sheet is missing the chart showing how forward voltage varies with temperature, so just use the numbers given in the table on the second page and assume the maximum room-temperature current of 30 milliamps.

From the derating curve on the third page of the data sheet, when operating at 30 milliamps, the temperature at the LED mounting point, which is the solder terminals, cannot exceed 50 degrees C. If you are going to operate at a maximum air temperature of 40 degrees C, you will need a heatsink with a thermal resistance of less than (50° − 40°)/0.108 watts ≈ 93 degrees C per watt.

If you have multiple LEDs per heatsink, the thermal resistance of the heatsink to ambient must be proportionally lower. (10 LEDs need 9.3 degrees per watt, etc.).
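The heatsink sizing above (and its scaling with LED count) can be sketched as a small helper. The function name and the 50 °C / 40 °C / 0.108 W inputs are just the example numbers from this post, not anything from the data sheet itself.

```python
def max_heatsink_resistance(t_mount_max, t_ambient, p_per_led, n_leds=1):
    """Maximum allowable heatsink-to-ambient thermal resistance (deg C per watt).

    t_mount_max: maximum solder-point temperature from the derating curve
    t_ambient:   maximum expected air temperature
    p_per_led:   dissipation per LED in watts (If * Vf)
    n_leds:      number of LEDs sharing this heatsink
    """
    return (t_mount_max - t_ambient) / (p_per_led * n_leds)

# One LED: about 93 deg C/W, as worked out above
print(max_heatsink_resistance(50, 40, 0.108))
# Ten LEDs on one heatsink: about 9.3 deg C/W
print(max_heatsink_resistance(50, 40, 0.108, n_leds=10))
```

Note that the result scales as 1/n: ten times the dissipation on one heatsink means the thermal resistance must be ten times lower.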

It is prudent to run the LEDs at a slightly lower current than the specified maximum.