[UPDATED] How should I power a 12V-DC, 12.3A LED circuit?

Thread Starter

-Ty-

Joined Feb 5, 2017
83
Why can't you just change the wire size? The cost is not that significant.
I've seen 25 ft of 2-conductor #12 wire for $10.00 on eBay.
With #12 wire you will only lose about 0.75 volts; I doubt you will even notice the difference using a 12-volt supply.
Another point: #18 wire is nominally rated at 10 amps max, and at 12.3 amps the wire will dissipate 38.5 watts, or almost 2 watts per foot. Use #12 wire.
SG
There are a few extra complications involved in using a thicker wire... One factor is the weight, which is already considerable with the bulky power supplies. Another is having to connect a 10-gauge 4-conductor or 8-gauge 2-conductor wire to a small C13 connector, which is the only way to plug the power cord into the LEDs (I know, that was a mistake).

Please though, can you link me to wherever you found the maximum current ratings for different STRANDED wire gauges? I can't find a definitive answer because some ratings are for a specific voltage drop, while others are for "power transmission" and the like.
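In the meantime, here's a minimal Python sketch I put together to sanity-check the drop and dissipation figures above. The ohms-per-1000-ft values are nominal solid-copper numbers from a standard wire table (stranded reads slightly higher), and the 20 ft one-way run is just my assumed length:

```python
# Voltage drop and cable dissipation for a 12.3 A load over a 20 ft run.
# Nominal solid-copper resistance in ohms per 1000 ft; stranded is a bit higher.
OHMS_PER_1000FT = {12: 1.588, 14: 2.525, 16: 4.016, 18: 6.385}

def cable_stats(awg, one_way_ft, amps):
    ohms = OHMS_PER_1000FT[awg] * (2 * one_way_ft) / 1000  # round trip
    drop = amps * ohms                                     # volts lost in cable
    watts = amps * drop                                    # heat in the cable
    return ohms, drop, watts

for awg in (12, 18):
    ohms, drop, watts = cable_stats(awg, one_way_ft=20, amps=12.3)
    print(f"#{awg}: {ohms:.3f} ohm, {drop:.2f} V drop, {watts:.1f} W in the cable")
```

That reproduces the roughly 0.75 V (#12) and 38.5 W (#18) figures quoted above.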
 

sghioto

Joined Dec 31, 2017
5,380
Then I would go back to your original plan and mount the power supplies on the panels, using the Mean Well RPS-200-12-C for the larger panels.
SG
 

MisterBill2

Joined Jan 23, 2018
18,176
One additional option is to use a power supply with remote sensing, although the PWM may confuse that function.
Consider, though, that you only need the full current and full voltage at maximum brightness; as the brightness is reduced, a lower voltage would not be a problem. Also consider that the voltage drop in the wires will not change as the duty cycle is reduced, because only the on-time changes: the peak current stays the same, and only the average power drops. So if the power supply is regulated well enough to stay stable with the pulsing load, its output voltage will not change as the duty cycle is varied. A NON-REGULATED supply would be different. Didn't that occur to anybody else?
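To put numbers on that, here is a minimal sketch treating the strip as a fixed load that is switched fully on or off. The 12.3 A peak is the thread's figure; the 0.255-ohm round-trip wire resistance is just an example value for 20 ft of #18:

```python
# With PWM, the ON-pulse current (and hence the wire drop during the pulse)
# is fixed; only the average current and average power track the duty cycle.
PEAK_AMPS = 12.3       # current while the PWM pulse is ON
WIRE_OHMS = 0.255      # example: 40 ft round trip of #18 copper

for duty in (1.0, 0.5, 0.1):
    avg_amps = PEAK_AMPS * duty
    pulse_drop = PEAK_AMPS * WIRE_OHMS            # unchanged by duty cycle
    avg_wire_watts = PEAK_AMPS ** 2 * WIRE_OHMS * duty
    print(f"duty {duty:4.0%}: avg {avg_amps:5.2f} A, "
          f"pulse drop {pulse_drop:.2f} V, avg wire loss {avg_wire_watts:4.1f} W")
```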
 

Reloadron

Joined Jan 15, 2015
7,501
You can buy 16 AWG, 2-conductor, 600 V, stranded, unshielded VNTC tray cable for about $0.50 a foot if you are worried about the I·R loss. While I have not used the LED strip you mention, I have used 5050 RGB LED strips rated for 12 VDC in motorcycle applications where the voltage varied between 12.6 and 14.6 volts, with no problem and no difference in intensity. As a matter of fact, with exactly 12 VDC applied, the difference between 80% and 100% pulse width was not even discernible.

Ron
 

sghioto

Joined Dec 31, 2017
5,380
Another option is the Mean Well LRS series, which costs about $26.00.
LRS-200-12... 12 volts at 17 amps, mounted to the panel if space allows. Voltage adjustable from 10.2 to 13.8 volts.... 8.5" x 4.5", 1.5 lbs.
LRS-200-15... 15 volts at 14 amps, mounted remotely if you insist on using #18 wire. Voltage adjustable from 13.5 to 18 volts.
SG
 
Last edited:

Norfindel

Joined Mar 6, 2008
326
Using THIS calculator (which shows the same results as two others) shows what you said: for a 12.3 A load at 12.30 V across a distance of 11 feet of 18 AWG, I lose 1.73 V and am left with 10.57 V. Yes, I believe that's still within a usable range for the LED strips, but it will be dim, very dim. I'd be losing over half of the potential brightness of these lamps, and given that the whole point of them is to use them for photography.... well. I need to get the voltage at the load as close to 12.0 V as possible... so I'm thinking: get an adjustable 15 V supply, dial it up to 15.14 V, and then I can get away with using even 20 feet of wire, and the voltage at the load will be 12.0 V after a drop of 3.14 V.
Correctly designed equipment has a voltage range to operate within specs. Are you sure that the 12 V is fed to the LED strips directly, and that any drop will affect the amount of light? Do you have any kind of manual or datasheet for those things?
If you need as close to 12 V as possible, then do what everyone in the world does: use the correct cable.
If you buy a higher-voltage PSU, then when you dim the LEDs the voltage drop will be lower, and the voltage at the LED strips may well be higher than 12 V. And of course, if you raise the PSU voltage to compensate for the drop, the drop will be higher anyway, and you're wasting power. Also, the cable could heat up: the power heating the wire is nearly 21 watts.

Maybe you need to stop trying to guess what could happen and start testing stuff. If you have a variable (lab) PSU, set it at 12 V, connect the strips, make them illuminate a dark room, and take a photo. Then lower the voltage and take another one.

I'm seeing a table of recommended cable sections for mains wiring, and it lists the maximum current for 1.5 mm² cable as 13 A. That would be around AWG 15. So you probably don't want to use anything much thinner than that.
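If you want to skip the online calculators, here's a minimal Python sketch of the same arithmetic; the AWG diameter formula is the standard one, and copper resistance is approximated as doubling every 3 gauge steps from a nominal #12 value:

```python
import math

# Convert a metric cross-section to AWG, and estimate round-trip voltage drop.
def mm2_to_awg(area_mm2):
    diameter_mm = math.sqrt(4 * area_mm2 / math.pi)
    return 36 - 39 * math.log(diameter_mm / 0.127, 92)   # standard AWG formula

def drop_volts(awg, one_way_ft, amps):
    ohms_per_kft = 1.588 * 2 ** ((awg - 12) / 3)   # ~doubles every 3 AWG steps
    return amps * ohms_per_kft * (2 * one_way_ft) / 1000

print(f"1.5 mm^2 is about AWG {mm2_to_awg(1.5):.1f}")
print(f"11 ft of 18 AWG at 12.3 A drops {drop_volts(18, 11, 12.3):.2f} V")
```

That prints about AWG 15.4 and a 1.72 V drop, within rounding of the 1.73 V figure from the calculator quoted above.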
 

MisterBill2

Joined Jan 23, 2018
18,176
I suggest an experiment: put an ammeter in series with a single lamp strip and see what the actual current is as the voltage is varied from 11 volts up to 12.8 volts, possibly in 0.20-volt steps. If the strips have a protective resistor, the current change will be smaller than if they do not. The measurements will be very educational to all of us.
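For comparison with the measurements, here's a minimal sketch of what a purely resistor-limited strip would do. The 3.2 V forward drop, 120-ohm segment resistor, and segment count are assumptions chosen to give 12.3 A at 12 V; a real LED's exponential I-V curve will bend away from this straight line:

```python
# Predicted current vs. voltage for a resistor-limited strip model:
# each segment = 3 LEDs (~3.2 V each, assumed) + 120-ohm resistor (assumed).
VF, R_SEG = 3.2, 120.0
SEGMENTS = round(12.3 / ((12.0 - 3 * VF) / R_SEG))   # ~615 segments at 12 V

for step in range(10):                    # 11.0 V to 12.8 V in 0.2 V steps
    v = 11.0 + 0.2 * step
    i = SEGMENTS * max(v - 3 * VF, 0.0) / R_SEG
    print(f"{v:5.2f} V -> {i:5.2f} A")
```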
 

Thread Starter

-Ty-

Joined Feb 5, 2017
83
Correctly designed equipment has a voltage range to operate within specs. Are you sure that the 12 V is fed to the LED strips directly, and that any drop will affect the amount of light? Do you have any kind of manual or datasheet for those things?
If you need as close to 12 V as possible, then do what everyone in the world does: use the correct cable.
If you buy a higher-voltage PSU, then when you dim the LEDs the voltage drop will be lower, and the voltage at the LED strips may well be higher than 12 V. And of course, if you raise the PSU voltage to compensate for the drop, the drop will be higher anyway, and you're wasting power. Also, the cable could heat up: the power heating the wire is nearly 21 watts.

Maybe you need to stop trying to guess what could happen and start testing stuff. If you have a variable (lab) PSU, set it at 12 V, connect the strips, make them illuminate a dark room, and take a photo. Then lower the voltage and take another one.

I'm seeing a table of recommended cable sections for mains wiring, and it lists the maximum current for 1.5 mm² cable as 13 A. That would be around AWG 15. So you probably don't want to use anything much thinner than that.
Now, see, you're saying the opposite of what others here have been saying. They say that dimming the lights through PWM won't actually change the current flow through the system, since each individual pulse draws the full 12.3 A, and so it won't lead to an ever-rising voltage at the load as I dim the lights. You're saying that the voltage WILL rise.

As for testing, if I had an adjustable PSU I would have used it already, haha. I don't own one and don't have access to one. However, I do know from experience with a voltage-dimmed 100 W LED COB that dropping the voltage below the nominal value causes the light to dim.

As for the cable spec, thank you.


I suggest an experiment: put an ammeter in series with a single lamp strip and see what the actual current is as the voltage is varied from 11 volts up to 12.8 volts, possibly in 0.20-volt steps. If the strips have a protective resistor, the current change will be smaller than if they do not. The measurements will be very educational to all of us.
As I mentioned earlier, my ammeter reads 12.3 A at full load, decreasing to very little (1 or 2 amps, IIRC) as I dim the lights. HOWEVER, as others have pointed out, that's just an apparent value: since the system is dimmed through PWM, my multimeter is averaging the current over time rather than reporting the discrete current draw during each "ON" pulse, which, according to the majority of people here, should be the full 12.3 A at any PWM duty cycle. Also, the LEDs do all have protective resistors.
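If that's right, here's a tiny Python sketch of the averaging effect, just to make the numbers concrete (the duty cycles are illustrative; something around 10-15% would explain the 1-2 A reading):

```python
# A DC-averaging meter reports duty * peak, even though every ON pulse
# still carries the full 12.3 A.
PEAK_AMPS = 12.3
for duty in (1.00, 0.50, 0.12):
    print(f"duty {duty:4.0%}: meter reads ~{PEAK_AMPS * duty:5.2f} A, "
          f"pulse current still {PEAK_AMPS} A")
```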
 

Thread Starter

-Ty-

Joined Feb 5, 2017
83
Another option is the Mean Well LRS series, which costs about $26.00.
LRS-200-12... 12 volts at 17 amps, mounted to the panel if space allows. Voltage adjustable from 10.2 to 13.8 volts.... 8.5" x 4.5", 1.5 lbs.
LRS-200-15... 15 volts at 14 amps, mounted remotely if you insist on using #18 wire. Voltage adjustable from 13.5 to 18 volts.
SG
Yeah, unfortunately, as much as I love how tiny and lightweight that mini RPS-200 power supply is, I think it's beyond my budget. Thank you, though, for these two cheaper options; their adjustment range is wider than anything else I've been able to find.
 

Ya’akov

Joined Jan 27, 2019
9,072
Yes, yes, I know: don't use AC plugs for DC projects. I get it, I learned my lesson. But there's no going back now; they can't be swapped out. Those aluminum panels were cut to fit the plugs, and I can't swap them for anything different unless it has the exact same shape and size.

SO, whatever power supply I get, I'll wire the output up to a C13 computer power cord.
A suggestion on this part: make pigtails with appropriate connectors using the C13, and epoxy them into the connectors so there is no chance of a smoke test. Don't get any of the epoxy past half of the plug body, and be sure to rough the surfaces up so the epoxy will stay.

You're right that you should not use a connector specific to high-voltage AC on a low-voltage DC device.
 
Last edited:

Norfindel

Joined Mar 6, 2008
326
Now, see, you're saying the opposite of what others here have been saying. They say that dimming the lights through PWM won't actually change the current flow through the system, since each individual pulse draws the full 12.3 A, and so it won't lead to an ever-rising voltage at the load as I dim the lights. You're saying that the voltage WILL rise.
Yes, I'm assuming that those LED strips are a bunch of LEDs with resistors and don't have constant-current regulation. Otherwise, what's the point? Just drive them with a CC driver and the current will always be right. Also, if they had CC regulators, the usable voltage range would be quite large. Don't you have any information about that? We would like to help, but we don't have clairvoyance abilities.
 

MisterBill2

Joined Jan 23, 2018
18,176
Since the strips do not seem to have on-board drivers, PWM intensity control will work very well. The power supplies, whatever they are, should have reasonable regulation; aside from that, no special concerns.
If we could see a picture of the finished project, it would help a bunch of folks learn.
 

Sensacell

Joined Jun 19, 2012
3,432
Some rough analysis, making some basic assumptions here.

Each LED sub-circuit...
3 x white LEDs in series: (3.2 V x 3) = 9.6 V. Design voltage = 12 V, so resistor burden voltage = (12 - 9.6) = 2.4 V.

Design current = 0.02 A per LED, which makes the resistor about 120 ohms: (2.4 / 0.02) = 120.
To get to 12.3 amps you must have about 615 of these little circuits in parallel - OK, fine.

If you decide to use the 20' of 18 AWG wire (40' round trip), that ends up being about 0.254 ohms.

615 of these 120-ohm resistors in parallel = 0.195 ohms. Add in the wire resistance, and now you have 0.449 ohms.
To get the full 12.3 A, you need 5.52 volts across this resistance, so the supply needs to be (5.52 + 9.6) = 15.12 V.

The dimmers located at the other end of the line will see a pulsating voltage that swings between 12 V and 15 V (3 volts of ripple) - probably OK. The LEDs will NOT burn out or run hot, but your wire will be burning 3 V x 12.3 A = 37 watts at full blast!

That's almost 2 watts per foot; it's gonna get very warm. That's the real problem with this idea.
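For anyone who wants to vary the assumptions, here is a minimal Python version of the arithmetic above (the 3.2 V forward drop, 20 mA per segment, and 18 AWG table value are the estimates used in this post):

```python
# Rough analysis: resistor-limited 12 V segments fed through
# 20 ft (40 ft round trip) of 18 AWG wire.
VF, I_SEG, V_DESIGN = 3.2, 0.02, 12.0
R_SEG = (V_DESIGN - 3 * VF) / I_SEG      # 120-ohm resistor per 3-LED segment
N = round(12.3 / I_SEG)                  # ~615 segments in parallel

R_STRIP = R_SEG / N                      # 0.195 ohm equivalent resistor bank
R_WIRE = 6.385 * 40 / 1000               # 0.255 ohm of 18 AWG, round trip
V_SUPPLY = 3 * VF + 12.3 * (R_STRIP + R_WIRE)   # voltage for the full 12.3 A
P_WIRE = 12.3 ** 2 * R_WIRE              # heat dissipated in the cable

print(f"supply needs {V_SUPPLY:.2f} V; cable burns {P_WIRE:.1f} W "
      f"({P_WIRE / 20:.1f} W per foot of 2-conductor cable)")
```

The small differences from the 15.12 V / 37 W figures above are just rounding.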
 

sghioto

Joined Dec 31, 2017
5,380
That's almost 2 watts per foot, it's gonna get very warm
That was mentioned back in post #17, but no one seemed interested. I guess as long as the insulation doesn't melt, it's not a problem.
As far as the LEDs are concerned, we have already confirmed from the panel photo that there are series resistors. Assume the current is limited to 20 mA per LED at 12 volts.
 

MisterBill2

Joined Jan 23, 2018
18,176
If wires run "hot" it will certainly lead to the insulation deteriorating at some rate and eventually becoming brittle. It makes far more sense to use #14 wire, which has roughly two and a half times the copper cross-section and is rated for 15 amps. Aside from having much less resistance, it will also have more surface area to radiate heat, a secondary benefit. If you can locate #14 wire with thinner, lighter insulation, that will work out even better. But right now I am trying to imagine a setup so close to falling over that a bit heavier wire would cause a problem. Standard photographic light tripods are rather stable. I have seen one individual hang the power source beneath the center of the tripod, just above the floor; that made it stable in a fair breeze.
 

Thread Starter

-Ty-

Joined Feb 5, 2017
83
Hello everyone,

It's become clear to me that a more complete explanation of the circuit and components is needed. So, behold:

LED Panels Circuit.jpg

This is just about as much information as I can provide. There is one resistor wired up for each group of 3 white LEDs, and the marking on it is 390, i.e., 39 ohms. I've emailed the eBay supplier for datasheets for the LED strips, but I doubt they'll have any to send me. If they do, I'll be sure to post them.

The circuit diagram also lists my intended supply voltage and wire sizes. As you can see, if I can find a 12 or 15 V supply that can hit around 14 V (such as the LRS-200-15), then I should get 12.0 V reaching the LEDs at the end of 20' of 16 AWG stranded copper wire.

Please note that the series resistor in the diagram represents the wire's internal resistance and isn't a discrete resistor; the 0.08 ohms is per conductor (about 0.16 ohms round trip), so the drop it represents is only correct at the full 12.37 A.
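(A quick Python check of that plan, using a nominal 4.016 Ω/1000 ft for solid 16 AWG copper; stranded will read slightly higher:)

```python
# Will ~14 V at the supply leave 12.0 V at the LEDs over 20 ft of 16 AWG?
R_WIRE = 4.016 * (2 * 20) / 1000   # round trip: 2 x 0.08 ohm = 0.16 ohm
I_FULL = 12.37                     # full-brightness current
drop = I_FULL * R_WIRE
print(f"drop {drop:.2f} V -> set the supply to ~{12.0 + drop:.2f} V")
```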

All I really need to know now is what will happen when I turn the dimmers down, reducing the duty cycle and thus the apparent voltage and current through the circuit. Will the lower average current lead to less voltage drop through the cable, and thus an over-voltage at the LEDs? Or will the true current draw stay the same during each "ON" pulse of the PWM system, so that even at a low duty cycle the voltage reaching the LEDs stays at 12.0 V? I have now read comments from people saying BOTH things, so I don't know who to believe.

As for power loss through the cable, keep in mind that all of this wiring will be exposed, not behind walls or permanently installed. So I do not care about wasting electricity or having the cable heat up. I don't even mind if it gets hot, as long as it remains safe and won't burn or ignite.

I WILL be picking up 14-gauge wire if I can find it at a decent price. I too am a fan of thicker wires.

Thanks again!
 
Last edited:

MisterBill2

Joined Jan 23, 2018
18,176
The peak current will not increase as the duty cycle is reduced if the supply has good regulation. If the supply is not well regulated, then the voltage could rise as the duty cycle drops. BUT if the voltage rises as the duty cycle drops, it will effectively reduce the dimming effect. So you need to do the experiment: connect the dimmer to the lights and power supply, after having set the supply voltage to 12 volts (or 11 volts, if that adjustment is available), and then measure the voltage at the supply terminals as the brightness is adjusted with the PWM controller. That will tell you how well the power supply regulates. If the regulation is adequate, the voltage reading at the supply terminals should not change. That will demonstrate that the supply you have will work as required, and you will have seen the proof yourself.
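A toy Python model of that experiment: the regulated case holds the supply output fixed, while the "unregulated" case uses a made-up 1.5 V no-load rise purely for illustration (the 0.16-ohm wire figure is the round-trip value from the diagram above):

```python
# Voltage at the LEDs during each ON pulse, regulated vs. a crude
# unregulated model whose output rises as the average load falls.
R_WIRE, I_PEAK, V_SET = 0.16, 12.3, 14.0

def v_at_leds(v_supply):
    # The wire drop only exists while the pulse is ON and 12.3 A flows.
    return v_supply - I_PEAK * R_WIRE

for duty in (1.0, 0.5, 0.1):
    v_reg = V_SET                          # well regulated: output never moves
    v_unreg = V_SET + 1.5 * (1 - duty)     # illustrative unregulated rise
    print(f"duty {duty:4.0%}: regulated pulse {v_at_leds(v_reg):.2f} V, "
          f"unregulated pulse ~{v_at_leds(v_unreg):.2f} V")
```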
 

Norfindel

Joined Mar 6, 2008
326
Even if the voltage is stable, I still think it's a bad idea to try to compensate for the voltage drop with extra voltage on the PSU. An adjustable supply, or one with a "weird" voltage, is likely to cost more than a typical 12 V PSU. That money could be used to buy the correct cable, and problem solved. That's a better solution than overloading a thinner cable.
 