I've done quite a bit of reading and I'm still a little confused about the difference between constant current and constant voltage drivers.
I'm working with LEDs in a +12 V DC car battery environment. Let's say I have 3 high-power LEDs (I don't know their wattage, but I know they're high power because they're mounted on heatsink plates). All 3 LEDs are connected in parallel. I've managed to successfully run this setup on a 350 mA (1-2 W) constant current driver with a +12 V DC input for a long time without problems. But a discussion with someone got me wondering: why a constant current driver and not constant voltage? It was just by chance that I chose the constant current driver.
It would be interesting to know. I've also tried running the 3 LEDs in series with this same driver, and that works fine too. The difference is that in series the output voltage of the driver becomes 10-point-something volts, whereas wired in parallel it measures about 3.4 V. I'm under the impression that in parallel the 350 mA gets divided equally across the 3 LEDs, whereas in series it delivers the full 350 mA through each LED and increases the voltage instead.
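For what it's worth, here's a rough sanity check of those two measurements, assuming each LED drops about 3.4 V at 350 mA (an assumption on my part, since I don't know the exact part):

```python
# Rough sanity check of the measured driver voltages, assuming each LED
# has a forward voltage (Vf) of roughly 3.4 V at 350 mA. The exact part
# isn't known, so Vf here is just an assumed typical value.

I_DRIVER = 0.350  # constant-current driver output, amps
VF = 3.4          # assumed forward voltage per LED at this current, volts
N_LEDS = 3

# Series: the same 350 mA flows through every LED; the driver raises its
# output voltage to the sum of the forward voltage drops.
v_series = N_LEDS * VF              # ~10.2 V, close to the "10-point-something" measured
i_per_led_series = I_DRIVER         # full 350 mA through each LED

# Parallel: the driver still pushes 350 mA total, but at roughly one
# forward-voltage drop; the current splits between the LEDs (equally only
# if the LEDs are perfectly matched, which real LEDs usually aren't).
v_parallel = VF                         # ~3.4 V, matching the parallel measurement
i_per_led_parallel = I_DRIVER / N_LEDS  # ~117 mA per LED, if perfectly matched

print(f"Series:   {v_series:.1f} V across the string, {i_per_led_series*1000:.0f} mA per LED")
print(f"Parallel: {v_parallel:.1f} V across the LEDs, {i_per_led_parallel*1000:.0f} mA per LED (if matched)")
```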
Just wondering: is constant current the correct driver, or should I really have gone for constant voltage?
Thanks