Ahh, so that's why I was only measuring 10 mA through one LED. I found that putting a 10 ohm resistor before the strip dropped the current a lot without much loss of brightness, but I took it off because the resistor got so hot I couldn't touch it, so I binned that idea and came for some advice. I'm not sure how safe that resistor would have been. I also tried a 100 ohm resistor, which dropped the current even more, but the lights were very dim with it added. I was just messing around, trying to learn from mistakes. I may try those lights you linked in a future project, as I do like that they don't show the individual light sources. Thank you!

Yes. That is, lower power, which is measured in watts (W). The formula is P = V × I (watts = volts × amps), so power is the product of both.
By lowering the voltage you lower the current, and the total power is reduced. Heat is proportional to power dissipation, and heat is what kills LEDs (and pretty much every other kind of component).
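To make the numbers concrete, here's a quick sketch of that P = V × I relationship. The 2.0 V forward drop is an assumed value for a typical LED, just for illustration:

```python
# Quick sanity check of P = V * I for a single LED.
# Assumed values: ~2.0 V forward drop, typical for an ordinary LED.
v_forward = 2.0   # volts across the LED (assumed)
current = 0.020   # amps (20 mA)

power = v_forward * current
print(f"LED dissipation at 20 mA: {power * 1000:.0f} mW")  # 40 mW

# Halving the current roughly halves the dissipated power,
# which is why running LEDs cooler extends their life.
print(f"LED dissipation at 10 mA: {v_forward * 0.010 * 1000:.0f} mW")  # 20 mW
```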
The area of LED power requirements can be confusing to a beginner, but don't despair: it is actually not very complicated. The reason for the difficulty is the nature of LEDs. They are non-linear devices; that is, their resistance changes with the current that passes through them. That's why a separate current-limiting resistor is required.
A plain LED drops in resistance as the current through it increases, and so it tries to draw ever more current. You can see how this is a bad thing if the power supply can provide enough current to let out the magic smoke: the LED will commit self-harm in a moment.
With the correct current-limiting resistor in place, the resistor, being an ohmic device, guarantees that the current will not exceed the maximum the LED can handle. A resistor is ohmic because its resistance doesn't change with the current passing through it¹; the LED is non-ohmic because its does. These designations refer to each device's relationship to Ohm's law: the ohmic device has a simple one, while the non-ohmic device requires taking that changeability into account.
As far as the proper current goes (ultimately, we use the current at the device's forward voltage as the measure of how much power we will be supplying), the best way to know is to find a datasheet for the device. In the absence of one, the general expectation is that an ordinary LED without a heatsink is properly powered at 20 mA. This may not be exactly right, but it is unlikely to be too much.
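Given that 20 mA target, the standard way to size the resistor is R = (V_supply − V_forward) / I: the resistor absorbs whatever voltage the LED doesn't. A minimal sketch, assuming a 5 V supply and a 2.0 V forward voltage (both illustrative values, not from your strip):

```python
# Sizing a current-limiting resistor for a single LED.
# Assumed values: 5 V supply, ~2.0 V forward voltage, 20 mA target.
v_supply = 5.0
v_forward = 2.0
i_target = 0.020  # 20 mA

# The resistor drops whatever voltage the LED doesn't,
# so R = (V_supply - V_forward) / I.
r = (v_supply - v_forward) / i_target
print(f"Resistor: {r:.0f} ohms")  # 150 ohms

# Also worth checking the resistor's own dissipation so it won't cook:
p_resistor = (v_supply - v_forward) * i_target
print(f"Resistor dissipation: {p_resistor * 1000:.0f} mW")  # 60 mW, fine for a 1/4 W part
```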
So, when you lowered the voltage you lowered the current the supplied current-limiting resistor would permit to flow. At 10 mA you are doing very well so far as longevity goes. If they are bright enough at that power level, that's great. The goal is to run them as cool as you can while still having sufficient light output.
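The voltage-to-current relationship above can be sketched with the same simplified model. Here a fixed 150 ohm resistor and 2.0 V forward drop are assumed values, and the model treats the forward voltage as constant (in reality it sags slightly at lower currents, so the true current runs a touch higher):

```python
# With a fixed series resistor, lowering the supply voltage lowers the current.
# Assumed values: 150 ohm resistor, ~2.0 V LED forward voltage.
R = 150.0
V_FORWARD = 2.0

def led_current(v_supply):
    # Simplified model: V_forward treated as constant; below it, no current flows.
    return max(v_supply - V_FORWARD, 0.0) / R

for v in (5.0, 3.5):
    print(f"{v:.1f} V supply -> {led_current(v) * 1000:.0f} mA")
# 5.0 V supply -> 20 mA
# 3.5 V supply -> 10 mA
```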
The strips I linked above are 5mm. I also misspoke a bit: I forgot that I found they do have cut marks on them. When I received them I thought they didn't, which would have meant finding the proper cut point through the phosphor; fortunately that's not necessary. The only complication is that the cut marks are on the back, under the adhesive liner, so you have to peel it to find them. Not a fatal flaw, but a bit annoying.