I'm designing a bespoke IR illuminator and was hoping someone could look over my calculations and correct them if needed, please.

Using a commercially produced IR LED array (5 parallel strings of 3 series LEDs), I have taken the following measurements downstream of the original control circuitry, in order to estimate the power consumption of the LEDs it contains.

Total voltage across one string (3 LEDs in series) = 4.3 V

Voltage across a single LED = 1.43 V

Makes sense so far: in a series string the LED voltages add up, while the current through each LED stays the same.
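A quick sketch of that check in plain Python (the 4.3 V figure is my measurement; this assumes the three LEDs are well matched and split the string voltage evenly):

```python
# Voltage check: series LEDs share the string voltage.
# Assumes well-matched LEDs, so each drops roughly the same voltage.
string_voltage = 4.3    # measured across one string of 3 series LEDs, volts
leds_per_string = 3

v_per_led = string_voltage / leds_per_string
print(f"Forward voltage per LED: {v_per_led:.2f} V")  # ~1.43 V
```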

Total current draw for all 5 parallel strings = 1.2 A

Current through a single series string (the same as through a single LED?) = 0.21 A

Makes sense so far: in parallel the string currents add up, while the voltage across each string stays the same.
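Sketching the same check for current turns up a small mismatch between my two measurements (1.2 A total implies 0.24 A per string, but I measured 0.21 A through one string). My assumption is that the strings simply aren't sharing current perfectly evenly:

```python
# Current check: parallel string currents should sum to the total draw.
total_current = 1.2             # measured total for all strings, amps
strings = 5
measured_string_current = 0.21  # measured through one string, amps

nominal_string_current = total_current / strings
print(f"Nominal per-string current: {nominal_string_current:.2f} A")  # 0.24 A
print(f"Sum of 5 measured strings:  {strings * measured_string_current:.2f} A")  # 1.05 A
# 0.24 A nominal vs 0.21 A measured suggests uneven current sharing
# between strings, which is common for parallel LED strings.
```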

Am I close enough in assuming each individual LED runs at roughly 1.43 V and 210 mA?

I will be wiring 6 of these LEDs in series, powered by an LM317, and have come up with 8.58 V @ 0.20 A (6 × 1.43 V).

Using an LM317 driver circuit with its 1.25 V reference voltage, I arrive at a resistor value of 6.25 Ω (1.25 V / 0.20 A), or the next highest available value, to power these LEDs.
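My working for the LM317 constant-current configuration, as a sketch. The 1.25 V reference is from the LM317 datasheet; the ~3 V of dropout headroom in the supply estimate is my rough assumption, not a measured figure:

```python
# LM317 as a constant-current source: the set resistor sits between
# OUT and ADJ, and the regulator holds 1.25 V across it, so I = 1.25 / R.
v_ref = 1.25     # LM317 reference voltage, volts
i_target = 0.20  # desired LED current, amps
v_led = 1.43     # measured forward voltage per LED, volts
n_leds = 6

r_set = v_ref / i_target
v_string = n_leds * v_led
print(f"String voltage: {v_string:.2f} V")    # 8.58 V
print(f"Set resistor:   {r_set:.2f} ohm")     # 6.25 ohm
# The supply has to cover the LED string, the 1.25 V reference, and the
# LM317 dropout (roughly 2-3 V), so the input needs headroom above 8.58 V.
print(f"Rough minimum supply: ~{v_string + v_ref + 3:.1f} V")
```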

Is it really that simple or am I missing something?

I'm also trying to wrap my head around the power dissipation of the set resistor, and I'm unsure whether to use voltage and resistance or current and resistance to calculate it. The two approaches produce wildly different answers: V²/R with 8.58 V and 6 Ω gives about 12.3 W, while I²R with 0.20 A and 6 Ω gives about 0.24 W.
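A consistency check I tried (my understanding, happy to be corrected): both formulas are the same law and should agree, provided the V and I plugged in both belong to the resistor itself, with V = I × R from Ohm's law:

```python
# P = V^2/R and P = I^2*R are the same law; they only disagree
# when V and I don't refer to the same component.
r = 6.0             # ohm, nearest standard value below 6.25
i = 0.20            # amps through the resistor
v_resistor = i * r  # voltage actually across the resistor (Ohm's law)

print(f"V across resistor: {v_resistor:.2f} V")          # 1.20 V
print(f"P = I^2 * R:       {i**2 * r:.2f} W")            # 0.24 W
print(f"P = V^2 / R:       {v_resistor**2 / r:.2f} W")   # 0.24 W
# Plugging the full 8.58 V into V^2/R gives ~12.3 W because that
# voltage sits across the LED string, not across the resistor.
```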

Any help or pointers in the right direction would be much appreciated, thanks!