Controlling High Power LED with Logic Level nMOS

Thread Starter

Chris Wu

Joined Jul 2, 2017
6
Hi there,

I am trying to control an LED (Vf = 6.25 V) from an Arduino (5 V logic).
My solution is to use an nMOS transistor to simply switch it on and off. However, I'm confused about whether or not I still need a current-limiting resistor for the LED, which I want to operate at 150 mA.

I know the equation for saturation current through an nMOS is Id = (K/2)(Vgs - Vt)^2, where K is a constant that depends on the specific MOSFET's parameters. So the MOSFET doesn't act like an ideal switch (open/closed with zero resistance). To get my desired 150 mA, should I just buy a MOSFET whose Id is 150 mA at Vgs = 5 V?
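(To illustrate how sensitive that equation is, here's a quick sketch; the K value and the Vt spread below are made up, not from any datasheet:)

```python
# Square-law saturation current: Id = (K/2) * (Vgs - Vt)^2
K = 0.1    # transconductance parameter, A/V^2 (hypothetical)
Vgs = 5.0  # Arduino logic-high gate drive, V

# Vt varies part-to-part and with temperature; sweep a plausible spread
for Vt in (1.0, 1.5, 2.0, 2.5):
    Id = (K / 2) * (Vgs - Vt) ** 2
    print(f"Vt = {Vt:.1f} V -> Id = {Id * 1000:.0f} mA")
```

Even a modest shift in Vt moves Id by hundreds of milliamps, which is part of what worries me.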

Thanks
 

#12

Joined Nov 30, 2010
18,224
No.
Mosfet specs are not that reliable. This would be much more predictable with a resistor in series with the LED and the mosfet. Choose the mosfet so that its drain-to-source resistance is insignificant compared to the resistor.
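For example (a minimal sketch, assuming a 12 V supply for the LED and treating the mosfet's drop as zero):

```python
# Series resistor for the LED, treating the mosfet as an ideal switch (Vds ~ 0)
V_supply = 12.0  # assumed LED supply, V
Vf = 6.25        # LED forward voltage, V
I_led = 0.150    # desired LED current, A

R = (V_supply - Vf) / I_led
print(f"R = {R:.1f} ohms")  # ~38.3 ohms; nearest standard value is 39 ohms
```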
 

Thread Starter

Chris Wu

Joined Jul 2, 2017
6
#12 said:
No.
Mosfet specs are not that reliable. This would be much more predictable with a resistor in series with the LED and the mosfet. Choose the mosfet so that its drain-to-source resistance is insignificant compared to the resistor.
Two followup questions:
1) So the mosfet when on can be modeled simply as a resistor? Not a voltage-controlled current source?
2) I'm looking at this data sheet: https://cdn.sparkfun.com/datasheets/E-Textiles/Other/FDS6630A.pdf, where at the top of the first page it reports Rds(on) at 30V, 6.5A for a given Vgs. So does this mean that at Vgs = 4.5V, there will always be 6.5A flowing through the mosfet? Or that you can have any amount of current up to 6.5A?

Thanks for the help
 

#12

Joined Nov 30, 2010
18,224
You can use a mosfet for a voltage controlled resistor, but the idea that you're going to find two resistor values, connect them as a voltage divider, and have a predictable resistor from drain to source is magical thinking. The range of gate voltages (across a dozen mosfets of the same part number) for a particular Rds and the variation of the gate turn-on voltage with temperature make this application a Fantasyland adventure.

If you want to use a mosfet as a constant current source or a fixed resistor, you need to use an op-amp to adjust the gate voltage as temperature changes and to find the exactly correct gate voltage for each mosfet you plug in. (Why would you want to make this so difficult when you can just buy a resistor?)

Chris Wu said:
So does this mean that at Vgs = 4.5V, there will always be 6.5A flowing through the mosfet?
Of course not. If you have a 12 volt supply and a 12k resistor in series with the mosfet, you can place the gate at +4.5 volts or +10 volts and you will never get more than a milliamp. You can give that particular mosfet 30 volts (as advertised) and +4.5V on the gate, and all you will get is smoke as it tries to dissipate 195 watts.
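To spell that arithmetic out (a quick sketch using the figures above):

```python
# 12 V supply with a 12k series resistor: the resistor sets the ceiling,
# no matter how hard the gate is driven
V_supply = 12.0
R_series = 12_000.0
print(f"I_max = {V_supply / R_series * 1000:.0f} mA")  # 1 mA

# No series resistance: full rated Vds across the device at rated current
print(f"P = {30.0 * 6.5:.0f} W")  # 195 W, i.e. smoke
```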

There are many limitations and uncertainties in a mosfet, and ALL of them must be obeyed and/or compensated for.
 

Thread Starter

Chris Wu

Joined Jul 2, 2017
6
So if I'm interpreting this correctly: if you have a 12 V source, current-limiting resistor, LED, and mosfet in series, you can assume for the current-limiting resistor calculation that Vds of the mosfet is 0 V (since Rds is negligible)?
 

dendad

Joined Feb 20, 2016
4,451
If your FET has the current handling capability, and you apply sufficient gate drive, for most applications you can treat the FET as a simple switch, sort of.
Using a logic level FET of a few amps capability, you can just about ignore the FET resistance and calculate the series current-limiting resistor from your supply voltage. Make sure to look at the resistor's power rating as well; there's a worked example below.
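(A sketch of that check; the supply voltage and Rds(on) below are assumed ballpark figures, not values from the FDS6630A datasheet:)

```python
# Power check for the series resistor and the FET, treating the FET
# as a near-ideal switch (all values assumed for illustration)
V_supply = 12.0  # assumed supply, V
Vf = 6.25        # LED forward voltage, V
R = 39.0         # nearest standard resistor value, ohms
Rds_on = 0.03    # ballpark logic-level FET on-resistance, ohms (assumed)

I = (V_supply - Vf) / (R + Rds_on)  # Rds(on) barely moves the result
print(f"I = {I * 1000:.0f} mA")                        # ~147 mA
print(f"resistor: {I**2 * R:.2f} W -> use a 1 W part")
print(f"FET: {I**2 * Rds_on * 1000:.1f} mW -> Vds ~ 0 is safe")
```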
I'm a firm believer in FET overkill, so use a much higher current FET than you really need. Most of the FETs I use have a 10 A rating or more. It did help that I got a 1k reel of them for 4 cents each some time ago :)
 

Thread Starter

Chris Wu

Joined Jul 2, 2017
6
Thanks for all the help everyone! Makes sense that you can treat power mosfets as basically ideal switches.

Just for completeness, however, I was wondering what the difference is between power mosfets used for switching and mosfets used in topologies like the common-source amplifier:

[attached schematic: common-source amplifier stage, V+ through Rd to the drain, output taken at the drain]
For a common-source stage, you cannot assume Vds = 0 (that'd defeat the purpose of the amplifier). Rather, my understanding for analyzing this circuit is to first determine the FET's mode of operation (triode or saturation), then the current Id through the drain, and finally Vout (Vout = V+ - Id*Rd). In other words, Id is set by the mosfet, and from there you compute the other voltage values.
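(A rough sketch of that procedure, using the square-law model from my first post; every component value below is made up for illustration:)

```python
# Common-source bias analysis with the square-law model (hypothetical values)
K = 0.02       # transconductance parameter, A/V^2
Vt = 1.5       # threshold voltage, V
V_plus = 12.0  # supply, V
Rd = 1_000.0   # drain resistor, ohms
Vgs = 2.5      # gate bias, V

# Assume saturation first, then verify: Id = (K/2)(Vgs - Vt)^2
Id = (K / 2) * (Vgs - Vt) ** 2
Vout = V_plus - Id * Rd  # Vds, since the source is grounded
if Vout >= Vgs - Vt:
    print(f"saturation confirmed: Id = {Id * 1000:.1f} mA, Vout = {Vout:.2f} V")
else:
    print("saturation assumption failed: re-solve with the triode equation")
```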

The common-source setup is basically the same as the nMOS switching circuit, yet for the latter you can assume Vds = 0, and the current is set by the series resistor and supply voltage. What is the difference between these two scenarios, and why can you only use the Vds = 0 approximation in the second?
 

dendad

Joined Feb 20, 2016
4,451
You can use the FET as an amplifier OR a switch.
To control the LED, you want a switch so you ensure the gate drive is enough to turn the FET on, and have minimum voltage drop (and power wasted) across the FET.
If you wanted the FET to run as an amplifier, the gate drive is set just enough to run the FET as a variable resistor. BUT the FET will dissipate heat and for any real power output, a heat sink will be needed.
The same FET can be a switch OR an amplifier, depending how it is driven.
Some FETs may be optimized for switching but they are basically the same.
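(To put rough numbers on the heat difference between the two modes; these figures are illustrative, not from any datasheet:)

```python
# FET dissipation: switch mode vs. linear (amplifier) mode, illustrative numbers
I = 0.150         # load current, A
Rds_on = 0.03     # fully-enhanced on-resistance, ohms (assumed)
Vds_linear = 6.0  # drain-source drop when biased partway on, V (assumed)

P_switch = I**2 * Rds_on   # fully on: well under a milliwatt, no heat sink
P_linear = Vds_linear * I  # partway on: the FET absorbs the difference as heat
print(f"switch: {P_switch * 1000:.2f} mW, linear: {P_linear:.2f} W")
```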
 

#12

Joined Nov 30, 2010
18,224
That "schematic' is almost completely useless.
I could make that a Class A audio amplifier if I drove the gate with a DC coupled negative feedback to an op-amp, but if you think you're going to set the quiescent point with 2 resistors, see post #4
 