Would LEDs reduce my minimum safe MOSFET voltage?

Thread Starter

LikeTheSandwich

Joined Feb 22, 2021
206
Let's say I have an LED strip with a forward voltage of 90V, and I'm powering it with 120V. To control it with a MOSFET, would that MOSFET technically only need to handle 30V? Or, to be sure I'm not damaging it, does the MOSFET need to be rated for the full 120V?
 

crutschow

Joined Mar 14, 2008
38,318
Let's say I have an LED strip with a forward voltage of 90V, and I'm powering it with 120V. To control it with a MOSFET, would that MOSFET technically only need to handle 30V? Or, to be sure I'm not damaging it, does the MOSFET need to be rated for the full 120V?
If it's 120Vdc, then the MOSFET should be rated for at least 150V for safety margin, since the LEDs have a negligible voltage drop when not conducting current.
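To put a number on that margin (a quick sketch; the 1.25 derating factor is a common rule of thumb, not anything specific to this part):

v_supply = 120.0            # Vdc rail; with the switch off, essentially all of it sits across the MOSFET
derating = 1.25             # illustrative rule-of-thumb safety factor
print(v_supply * derating)  # 150.0 -> minimum V_DS rating, matching the 150V suggestion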
 

Thread Starter

LikeTheSandwich

Joined Feb 22, 2021
206
If it's 120Vdc, then the MOSFET should be rated for at least 150V for safety margin, since the LEDs have a negligible voltage drop when not conducting current.
They have a negligible voltage drop when not conducting current? Interesting. So if the total forward voltage is 90VDC and I fed it with 80VDC, the MOSFET would experience the full 80V? But at 120V, when the LEDs are conducting, the MOSFET would only experience 30V?
 

crutschow

Joined Mar 14, 2008
38,318
They have a negligible voltage drop when not conducting current?
They have essentially zero voltage drop.
You can only have a voltage drop when something is carrying current.
If there is no current then there is no voltage drop.
So if the total forward voltage is 90VDC and I fed it with 80VDC, the MOSFET would experience the full 80V?
Yes, if the MOSFET is OFF.
But at 120V, when the LEDs are conducting, the MOSFET would only experience 30V?
No.
If the LEDs are conducting, then the MOSFET is ON and would show only a voltage drop equal to its ON-resistance times the LED current.
The 30V would appear across the dropping resistor (or whatever circuit is regulating the LED current).
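As a minimal sketch of the two states (the 30 ohm ballast resistor and 0.1 ohm on-resistance are illustrative values, not from this thread):

# KVL around the loop: V_supply = V_led + V_ballast + V_ds
V_SUPPLY  = 120.0   # Vdc
V_LED     = 90.0    # total forward voltage of the strip when conducting
R_BALLAST = 30.0    # ohms, hypothetical current-limiting resistor
R_DS_ON   = 0.1     # ohms, hypothetical MOSFET on-resistance

# MOSFET ON: the strip conducts and the ballast takes up the 30V difference
i_led     = (V_SUPPLY - V_LED) / (R_BALLAST + R_DS_ON)  # ~1 A
v_ds_on   = i_led * R_DS_ON                             # ~0.1 V across the MOSFET
v_ballast = i_led * R_BALLAST                           # ~30 V across the resistor

# MOSFET OFF: no current, so no drop across the LEDs or the resistor
v_ds_off = V_SUPPLY                                     # the full 120 V appears across the MOSFET

print(f"ON:  I = {i_led:.2f} A, V_ds = {v_ds_on:.2f} V, V_ballast = {v_ballast:.1f} V")
print(f"OFF: V_ds = {v_ds_off:.0f} V")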
 

Ian0

Joined Aug 7, 2020
13,097
Let's say I have an LED strip with a forward voltage of 90V, and I'm powering it with 120V. To control it with a MOSFET, would that MOSFET technically only need to handle 30V? Or, to be sure I'm not damaging it, does the MOSFET need to be rated for the full 120V?
In theory, yes you could. But in practice:
The forward voltage of the LEDs is their voltage drop at a certain current, NOT the voltage at which they start to conduct. There will be some leakage current at almost any voltage, and since the OFF MOSFET is by far the highest impedance in the loop, essentially the full supply voltage will appear across it, albeit at a small current. Many MOSFETs are avalanche rated, so you might get away with it, but it is firmly in the "pushing your luck" category.
If the LEDs were zeners, then you might stand a better chance.
If you had to design as though you were making 100 of these and paying the postage on every customer return, how would you design it?

Is that 120V the RMS AC voltage before it is rectified? If so, then the MOSFET needs to be rated for at least 190V (120V × 1.1 × √2 ≈ 187V, allowing 10% tolerance on the supply voltage).
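Working that worst-case peak out explicitly (a sketch, assuming the usual +10% line tolerance):

import math

v_rms  = 120.0                             # nominal AC line voltage
v_peak = v_rms * 1.10 * math.sqrt(2)       # +10% high line, then the peak of the sine
print(f"worst-case peak: {v_peak:.0f} V")  # ~187 V, so something like a 200 V part leaves headroom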
 

MisterBill2

Joined Jan 23, 2018
27,180
In a series circuit, when the LEDs are not yet biased into conduction, they act like a resistor of some fairly high value, in addition to having some very small diode effect.
So"C" in post #5 is correct, in that the actual voltage across the switch may reach the peak voltage of the wave form. It would be in series with a high resistance, and so the results will not be accurately predictable.
So I suggest not pushing your luck. If the mosfet has an over-voltage breakdown to a shorted condition then you will have the full 120 volts RMS across the LED string, at some current limited only by the circuit resistance. Until one or more LEDs fail open-circuit.
Probably not the preferred result.
 