Help me apply Ohm's Law to reduce brightness of an LED lamp

Thread Starter

OrlyP

Joined Sep 8, 2021
3
Hi, I'm new here and still trying to get a feel for the site.

I have a seemingly simple problem. I bought a few "retro Edison" E14 LED bulbs rated for 2W @ 220V. However, they seem to have a very short lifespan, and I would very much like to reduce the voltage fed into them. As a side preference, I would also like them to glow slightly less brightly. As these are non-dimmable LEDs, using a standard dimmer is probably out of the question.

Our mains sits at 230V on average.

Using a power meter, this is what I gathered when powering one of these bulbs:

Input: 230V / 60Hz
Power: 2W
Current: 0.042A
PF: 0.2

Now, is it possible to use a resistor in series so that the LED will see a lower voltage? What would be the formula for R (resistance value and power rating)?

Thank you.
 

crutschow

Joined Mar 14, 2008
28,204
OrlyP wrote: "What would be the formula for R (resistance value and power rating)?"
The desired resistance would be approximately R = V / .042A where V is the desired reduction in voltage (e.g. 10V if you wanted to drop the 230V to 220V).
The resistor power dissipated would be approximately V * 0.042A.
The resistor power rating should be no less than twice the value calculated.

If the power dissipated gets too high for the amount of voltage reduction you want, you might use a film capacitor in series with the bulb, which will dissipate no power.
I can help with determining the capacitor value if you post how much voltage reduction you want.
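As a quick sketch of the arithmetic above, assuming the bulb keeps drawing roughly the measured 0.042 A (only an approximation, since an LED driver is a non-linear load):

```python
# Sketch of the series-resistor sizing described above.
# Assumes the load current stays near the measured 0.042 A.

def series_resistor(v_drop, i_load=0.042):
    """Return (resistance_ohms, dissipation_watts, suggested_rating_watts)."""
    r = v_drop / i_load      # R = V / I
    p = v_drop * i_load      # P = V * I dissipated in the resistor
    return r, p, 2 * p       # rate the resistor at no less than 2x dissipation

r, p, rating = series_resistor(10.0)   # e.g. drop 230 V down to ~220 V
print(f"R = {r:.0f} ohm, dissipates {p:.2f} W, use a >= {rating:.2f} W part")
```

For a 10 V reduction this lands near 240 ohms dissipating under half a watt, so a common 1/2 W or 1 W part would do.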
 

Sensacell

Joined Jun 19, 2012
2,919
Those LED bulbs typically have a low power factor, which means they do not draw power in a linear way; they typically draw narrow spikes of current, which makes your resistor selection a bit tricky.

Start with Ohm's law to get you in the ballpark, but you will need to experiment: buy a bunch of resistors so you can play with it and see how it really works.
 

Ian0

Joined Aug 7, 2020
3,765
First of all, I suggest you dismantle a dead one to see how it is powered.
If PF = 0.2, then you will probably find four diodes, a couple of small resistors, an electrolytic capacitor, and a large film capacitor, probably Class X.
The class X capacitor (C1) is what determines the power. The current flowing in the LEDs is approximately equal to the mains voltage divided by the reactance of the capacitor.*
Increase the capacitor reactance by putting another capacitor in series and you can reduce the power to the lamp.
You can add resistance as suggested above, but it makes things that much more complicated. The current is then
\(
I = \frac{V}{\sqrt{R^2 + \left(\frac{1}{2 \pi f C}\right)^2}}
\)
[Attached schematic: Untitled 1.jpg]
But whatever you do, DON'T try to dim it with a standard phase-fired light dimmer. That would be the quickest way to make it fail.

Of the other components: R2 limits the inrush current as C1 charges. R1 discharges C1 so you don't get a shock from the lamp when it has been removed. C2 smooths the output voltage. D5 represents all the LEDs.

Now, if you dismantle it and find a small switched mode supply, then all bets are off, as the switched-mode supply regulates the current.

*It isn't quite this simple, but unless there are a lot of LEDs in series, it is close enough.
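The formula above can be sketched numerically. The 470 nF value below is purely illustrative (it happens to land near the measured 42 mA); a real bulb's C1 has to be read off the actual part, and the LED string voltage is treated as negligible per the footnote:

```python
# Sketch of the capacitive-dropper current formula above.
# C1 = 470 nF is an illustrative guess, not a measured value.
import math

def dropper_current(v_rms, f_hz, c_farads, r_ohms=0.0):
    """I = V / sqrt(R^2 + Xc^2), treating the LED string voltage as ~0."""
    xc = 1.0 / (2.0 * math.pi * f_hz * c_farads)   # capacitive reactance
    return v_rms / math.sqrt(r_ohms**2 + xc**2)

def series_caps(c1, c2):
    """Two capacitors in series: C = C1*C2 / (C1 + C2)."""
    return c1 * c2 / (c1 + c2)

i_orig = dropper_current(230, 60, 470e-9)                     # C1 alone
i_less = dropper_current(230, 60, series_caps(470e-9, 470e-9))  # extra cap in series
print(f"{i_orig*1000:.1f} mA -> {i_less*1000:.1f} mA")
```

Putting an equal-valued capacitor in series halves the capacitance, doubles the reactance, and so roughly halves the lamp current.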
 

Yaakov

Joined Jan 27, 2019
3,755
The definitive resource for stuff like this is Big Clive. His YouTube channel has many videos relevant to this topic, but I think this one might be particularly helpful. He is also accessible: a couple of pounds on a Patreon sponsorship gets you direct messaging, which he answers pretty quickly.

 

Thread Starter

OrlyP

Joined Sep 8, 2021
3
crutschow wrote:
The desired resistance would be approximately R = V / .042A where V is the desired reduction in voltage (e.g. 10V if you wanted to drop the 230V to 220V).
The resistor power dissipated would be approximately V * 0.042A.
The resistor power rating should be no less than twice the value calculated.

If the power dissipated gets too high for the amount of voltage reduction you want, you might use a film capacitor in series with the bulb, which will dissipate no power.
I can help with determining the capacitor value if you post how much voltage reduction you want.
You reminded me that I had a bunch of unused capacitors and resistors that came with my Broadlink smart switches. They were meant to be connected in parallel with LED light fixtures so they don't flicker when switched off. Mine never did flicker so I didn't use them.



Anyway, I tried both the capacitor and the resistor in series and eventually settled on the latter. With the resistor connected in series, the LED bulb is getting a little over 80V. Surprisingly, the LED still lights up, and the brightness is sufficient for my use case.
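As a back-of-envelope check of the numbers above, assuming the current were still near the measured 0.042 A (in practice it will be lower, since the driver sees less voltage, so the real dissipation is less than this):

```python
# Rough check of the reported result: ~80 V at the bulb from 230 V mains.
# Assumes the current stays near the originally measured 0.042 A.
v_mains, v_led, i = 230.0, 80.0, 0.042
r = (v_mains - v_led) / i    # implied series resistance
p = (v_mains - v_led) * i    # upper bound on resistor dissipation
print(f"R = {r:.0f} ohm, P < {p:.1f} W")
```

That works out to a few kilohms dissipating at most a few watts, which is why a physically large (overkill) resistor is a reasonable choice here.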

Thank you everyone!
 

Thread Starter

OrlyP

Joined Sep 8, 2021
3
"Seems more like you used trial and error than applying Ohm's law!"
Lol.... Yes. And apologies for taking the quick way out.

I knew the parts I had were meant to be connected in shunt across 230V, so I thought: what can go wrong if I put either in series with an LED load? The resistor in particular is overkill, but it's what I already have, so I just didn't want to complicate things any further.
 