DC power supply voltage dropping from 6V to 3V when hooking up a string of 20 parallel LEDs

Thread Starter

xbonet

Joined Nov 19, 2018
10
I bought a small string of 20 LEDs (the kind with SMD LEDs soldered in parallel to enamelled copper wire and then covered in resin) and I'm planning on connecting it to another project of mine through USB, so I don't need to keep buying CR2032 batteries for it.

Now, the weird thing is that they run off two CR2032 batteries, i.e. 6V. The LEDs are warm white, so I'm assuming their forward voltage is somewhere between 3.2-3.5V. As I said, they're just wired in parallel and there's no resistor I can see (unless it's built into the LEDs themselves), so I'm not sure why they need two 3V batteries or where the extra 3V is actually going.

So I hooked them up to my variable-voltage DC power supply and found that once the voltage gets up to around 3.2V (the beginning of the forward-voltage range I had in mind) it plateaus and stays there. If I set the voltage at 6V to start with and then hook up the LEDs, my voltage readout drops to 3.2V. So the actual voltage output by my variable supply is limited to 3.2V, no matter how much I ramp it up. (I haven't gone much past what I reckon is 6V on the pot, as I don't want to burn out the LEDs in the name of exploration; I need them!)

I then got out my breadboard and hooked them up there through a couple of resistors (47, 100, 150 ohms, because I wanted to regulate their brightness anyway), and even though I had two extra single 5mm LEDs (one normal yellow, one super-bright warm white) hooked up to the same circuit, there's no voltage dip in my power supply.

Now, why is this happening? Is it normal (I haven't seen a drop like this before when using LEDs), or is there something I can't see in the string that's actually regulating the voltage?
 

LesJones

Joined Jan 8, 2017
2,321
You should not drive LEDs with a constant voltage. If you want to drive them from 6 volts, connect a resistor in series with EACH LED. (Or buy specially matched LEDs that all have the same forward voltage at the rated current; matched LEDs can be connected directly in parallel with a common current-limiting resistor.) The resistor value is 6V minus the LED forward voltage, divided by the rated current (or the current you plan to run them at; I would suggest not running them at more than about 70% of the maximum current rating).

Les.
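The arithmetic Les describes can be sketched in a few lines of Python. The forward voltage (3.3 V) and rated current (20 mA) below are assumed example values for a warm-white SMD LED, not figures from any datasheet:

```python
def series_resistor(v_supply, v_forward, i_led):
    """Resistor (ohms) needed to drop the excess voltage at the chosen current."""
    return (v_supply - v_forward) / i_led

v_supply = 6.0          # two CR2032 cells in series
v_forward = 3.3         # assumed warm-white forward voltage
i_rated = 0.020         # assumed 20 mA maximum rating
i_run = 0.7 * i_rated   # Les suggests running at ~70% of maximum

r = series_resistor(v_supply, v_forward, i_run)
print(f"{r:.0f} ohms per LED")  # 193 ohms
```

You would then pick the nearest standard resistor value at or above the result (e.g. 220 ohms) for each LED.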
 

Thread Starter

xbonet

Joined Nov 19, 2018
10
You should not drive LEDs with a constant voltage. If you want to drive them from 6 volts, connect a resistor in series with EACH LED. (Or buy specially matched LEDs that all have the same forward voltage at the rated current; matched LEDs can be connected directly in parallel with a common current-limiting resistor.) The resistor value is 6V minus the LED forward voltage, divided by the rated current (or the current you plan to run them at; I would suggest not running them at more than about 70% of the maximum current rating).

Les.
Hi, Les! Thanks for your response, and thanks for the recommendation. Yes, I have been reading a lot about driving LEDs lately, and one common must-do I keep finding is that each LED should have its own resistor if wired in parallel. However, in this case what I'm specifically interested in understanding is the reason behind the voltage drop in my controlled power supply. I've never seen this happen with my power supply, no matter what I've connected to it; and I've connected loads of LEDs (single LEDs and strings of them), but this is the first time I've seen the voltage drop and plateau.

I was always going to put a resistor on this string, as I don't want it giving out so much light, so that's working OK; but I'd really like to understand why this other behaviour was happening, and whether it's normal or there's something else interesting going on that I might learn from.
 

wayneh

Joined Sep 9, 2010
16,102
My guess is that the combined current draw of your LEDs exceeded the capacity of the power supply and it couldn’t keep up. It inadvertently protected you from destroying your LEDs.
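A rough back-of-the-envelope version of wayneh's guess, with all figures assumed for illustration (20 mA per LED and a hypothetical 0.25 A supply limit; neither is stated in the thread):

```python
n_leds = 20
i_per_led = 0.020            # assumed current each LED would draw near full brightness
i_demand = n_leds * i_per_led

i_limit = 0.25               # hypothetical maximum the small supply can deliver

print(f"demand ~ {i_demand * 1000:.0f} mA, supply limit {i_limit * 1000:.0f} mA")
if i_demand > i_limit:
    # A current-limited supply sags its output voltage until the load only
    # draws what it can deliver; with parallel LEDs and no series resistor,
    # that equilibrium lands right around the LEDs' forward voltage (~3.2 V).
    print("supply sags toward the LEDs' forward voltage")
```

This would explain why the readout plateaus near 3.2 V: the supply drops into current limiting, and a resistorless parallel LED string pins the output at roughly one forward-voltage drop.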
 

Thread Starter

xbonet

Joined Nov 19, 2018
10
Did you have a current limit set on the supply?
What is the rated maximum current the supply can provide at 6V?
Mmm, that's interesting! I once measured the supply's current at its maximum of 12.6VDC and got 0.05mA, and just assumed that was the supply's current. I never actually measured it across the range. But after your comment I've just gone and done exactly that, and the result is that my multimeter doesn't measure anything until 3.8V, when it starts reading 0.01mA (so the current below that is lower than my multimeter's minimum threshold; the power supply's minimum voltage is 1.2V). Then at around 6.3V it goes up to 0.02mA, and from then on it fluctuates between 0.02mA and 0.05mA. (This is something else I don't understand: why does it fluctuate? Probably my power supply or my multimeter, or both, are not very good quality.)

What I find really interesting is that the voltage where it plateaus with the LED string is not far off the voltage where the current starts being measurable on the multimeter. Although perhaps that's just coincidental.

Anyway, I'm not sure if the power supply has a current limit, and it had no rating... or any info, for that matter! It's one of the cheap DIY project kits from Gearbest, Banggood or AliExpress. This is it, with its components on display:

IMG_2528.JPG
 

LesJones

Joined Jan 8, 2017
2,321
If you only had 0.02 mA flowing through the LEDs they would be VERY dim. Show us EXACTLY how your multimeter is connected while measuring the LED current. Also show us which sockets the test leads are plugged into on your multimeter. Some meters have an extra socket for measuring higher currents.

Les.
 

wayneh

Joined Sep 9, 2010
16,102
And more importantly, STOP until you understand how to properly use your meter to measure current. It MUST be in series with the load, not the usual parallel connection you would use for a voltage measurement. Otherwise you'll cause a short and possibly damage your power supply and blow the fuse in your meter.
 