PWM current vs DC current

Thread Starter

Topad

Joined Nov 30, 2021
15
Hi, how is the real heating current of a PWM signal measured — as an average? True RMS? Or is it the same as normal DC current? The signal is 0 to 5 V with no negative values, so it's more like pulsing DC. For example: two LEDs (Vf = 2 V) connected to 9 V through 250 ohms, at 10 Hz with 50% duty cycle — what would the true current be? Another example: a buzzer at about 2 kHz and 50% duty; measured directly, the DC current is 120 mA, and the average is about 80 mA. Thank you.
 

Ian0

Joined Aug 7, 2020
10,001
Voltage * Current * duty cycle gives the average power (assuming that the current and the voltage don't change during the on period).
If the load is resistive then the power is (V^2/R) * d
You can work back and get the true rms voltage and you'll find that it is V√d
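A minimal sketch of those two formulas, using illustrative values (5 V on-state, 250 ohm resistive load, 50% duty — these numbers are assumptions, not from the thread):

```python
# Average power and true-RMS voltage of a PWM waveform across a
# purely resistive load, per the formulas above.
V = 5.0    # on-state voltage (V)
R = 250.0  # load resistance (ohms)
d = 0.5    # duty cycle (0..1)

p_avg = (V ** 2 / R) * d   # average power: (V^2 / R) * d
v_rms = V * d ** 0.5       # true-RMS voltage: V * sqrt(d)

# Sanity check: the RMS voltage delivers the same average power into R
assert abs(v_rms ** 2 / R - p_avg) < 1e-12
print(p_avg, v_rms)
```

With these numbers the average power comes out to 0.05 W and the RMS voltage to about 3.54 V.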
 

LowQCab

Joined Nov 6, 2012
4,191
The LED-Current does not change.

The amount of Heat generated by the LED will be reduced by the PWM%;
this "may" allow running the LED closer to its maximum-rated Current,
that is, as long as roughly ~80% to ~90% PWM is never exceeded.

Increasing the Current will either shorten the life of the LED, or possibly smoke it instantly.

There's no such thing as a "Free-Lunch".
 

MisterBill2

Joined Jan 23, 2018
18,934
LQC is correct! The on-state current is the same; the total heating is proportional to the duty-cycle on time, up to the point of breakdown. So with very short pulses the current can be greater. But at some point that falls apart as other damage mechanisms come into play.
 

Thread Starter

Topad

Joined Nov 30, 2021
15
Thank you guys. So the heating current stays the same even at 50% PWM. It's not that 20 mA is reduced to 10 mA or an average; it just switches on and off between 0 mA and 20 mA, but the heating will be less because of the off time, and the life of the LED will probably be slightly better than continuous operation.
 

Ian0

Joined Aug 7, 2020
10,001
Topad said: "So the heating current stays the same even at 50% PWM. It's not that 20 mA is reduced to 10 mA or an average... the heating will be less... the life of the LED will probably be slightly better than continuous."
It depends on the PWM frequency and the thermal time constant. At high frequencies, 20mA @ 50% duty cycle will be exactly the same as 10mA continuous. If the LED has a chance to warm up on each pulse, then it has to be treated as 20mA
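A short sketch of the distinction, using the thread's 20 mA / 50% duty numbers. Note that average and RMS current differ for a pulsed waveform; for an LED, which drops a roughly constant Vf, the average dissipation tracks the average current (the constant-Vf model is a simplification):

```python
# 20 mA pulses at 50% duty: average vs RMS current, and LED dissipation.
I_on = 0.020  # on-state current (A)
d = 0.5       # duty cycle
Vf = 2.0      # approximate constant forward voltage (V)

i_avg = I_on * d        # average current: 10 mA
i_rms = I_on * d ** 0.5 # RMS current: ~14.1 mA (matters for resistive heating)

# With a roughly constant Vf, average LED dissipation follows the
# AVERAGE current: half the duty cycle -> half the heat of continuous 20 mA.
p_led = Vf * I_on * d
print(i_avg, i_rms, p_led)
```

Whether the junction actually cools between pulses depends on the thermal time constant versus the PWM period, as described above.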
 

Thread Starter

Topad

Joined Nov 30, 2021
15
Thank you. At what frequency will the current start to behave like 10 mA continuous? I had them at 7 Hz, which is low, at 50%. I also had another LED at about 110 Hz going from 0 to 100% PWM duty cycle and back down to 0%, in 5% steps. My question is how to treat the current in these cases when PWM and frequency are involved. I'd like to run some LEDs close to their max, about 18 mA. The LEDs were tested at Vf = 2 V at 20 mA, with a max current of 30 mA.
 

LowQCab

Joined Nov 6, 2012
4,191
You should operate by assuming that ........
PWM Frequency, or Percentage-of-On-Time,
has NOTHING to do with the Current that the LED is operating at.
ZERO, Nada, Zip, don't even consider it as a remotely related factor.
Nothing will change the maximum allowable LED-Current.

Unless You can accurately measure and continuously monitor the Temperature of your LEDs,
the concept of less Heat generated because of a reduced PWM % is only an academic observation
and has no practical application in a Hobby-Project-Environment.

If You run your LEDs right at, or exceeding, their maximum-rated-Current,
AT ANY TIME, or at any PWM PERCENTAGE,
You will shorten their Life-Expectancy, or possibly pop them instantly.

If You need to run your LEDs at, or very near, their Maximum-Rated-Current, You need more powerful LEDs.

Why are You running your LEDs at "Maximum-Rated-Current",
and then reducing their Light-Output by way of PWM-Dimming ?????

The Voltage across the LED is completely irrelevant,
as long as You have enough Voltage to cause the amount of CURRENT to flow that You want.
LEDs care about CURRENT, You generally don't need to know the Voltage across the LED,
except when calculating the value of a Current-Limiting-Resistor.
( unless You are running them on a Single-Cell Battery, or stacking multiple LEDs in series ).
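A worked example of the current-limiting-resistor calculation mentioned above, using numbers from earlier in the thread (9 V supply, two LEDs with Vf = 2 V in series, and the 18 mA target is the thread starter's stated goal):

```python
# Current-limiting resistor: R = (V_supply - total LED drop) / I_target
V_supply = 9.0
Vf = 2.0        # forward voltage per LED (V)
n_leds = 2      # two LEDs in series
I_target = 0.018  # 18 mA target

R = (V_supply - n_leds * Vf) / I_target
print(R)
```

This gives about 278 ohms; the next standard value (270 or 300 ohms) would be chosen depending on which side of the target you prefer to err on.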

If your LEDs are actually rated for ~30mA max,
the best plan is to use a Current-Regulator, instead of a Resistor, like a CL25N3-G-ND .

The Current can also be Regulated by using a standard
generic Voltage-Regulator-Chip configured as a Current-Regulator.
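One common instance of the "voltage regulator as current regulator" idea is an LM317 with a single resistor between its OUT and ADJ pins, so the load current is held at I = V_ref / R with V_ref ≈ 1.25 V. (The LM317 is my assumption here; the post only says "generic voltage-regulator chip".)

```python
# LM317 configured as a current source: I_out = V_ref / R_set
V_ref = 1.25       # LM317 reference voltage (V)
I_target = 0.018   # desired LED current (A), per the thread starter's goal

R_set = V_ref / I_target
print(R_set)
```

This works out to about 69 ohms; a standard 68 ohm resistor would give roughly 18.4 mA, independent of supply-voltage fluctuations.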

Either of the above solutions will keep the LED-Current consistent
regardless of any Power-Supply Voltage-fluctuations.

PWM-Dimming can still be used with either of the above solutions.
 

MisterBill2

Joined Jan 23, 2018
18,934
In addition to what LQC has said, LEDs will produce light at currents and voltages well below the stated maximum rating. But there are variables and so even among red LEDs there is a fair amount of difference between part numbers. 20 milliamps was specified for a specific light output for a given half-output lifetime, not as a mandatory drive current.
 

Thread Starter

Topad

Joined Nov 30, 2021
15
The LEDs are rated for an absolute maximum of 30 mA and were tested at 20 mA, which is the upper safe limit for life expectancy. I'd like to run them close to that limit, say 17-18 mA, to get the most brightness, but at the same time not go over 20 mA, using a simple limiting resistor.
 