Raspberry Pi driving an LED through a MOSFET

Thread Starter


Joined Jan 19, 2018
Hi everyone,

For a job project I need to control an LED with a Raspberry Pi!

The LED is a white one and it has to be PWM-driven in order to produce a sort of "breathing" effect.
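As a side note, the breathing effect itself is just a duty-cycle sweep. A minimal Python sketch (the pin number, carrier frequency, and timing below are my own assumptions, not values from this thread):

```python
import math

def breathing_duty_cycles(steps=100):
    """Return one 'breath' as a list of PWM duty-cycle percentages (0-100),
    following a raised sine so the LED fades in and back out smoothly."""
    return [50.0 * (1.0 - math.cos(2.0 * math.pi * i / steps)) for i in range(steps)]

# On the Pi itself the sweep could be played back with RPi.GPIO's software
# PWM (GPIO pin 18 and the sleep interval are assumptions):
#
#   import time
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BCM)
#   GPIO.setup(18, GPIO.OUT)
#   pwm = GPIO.PWM(18, 1000)   # 1 kHz carrier
#   pwm.start(0)
#   while True:
#       for dc in breathing_duty_cycles():
#           pwm.ChangeDutyCycle(dc)
#           time.sleep(0.03)
```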

Unfortunately I don't have the LED datasheet, so I've empirically found (using my bench power supply) that the LED forward voltage is approx. 3V and that the brightness I want is obtained at approx. 50mA.

So I can't drive the LED by attaching it directly to a GPIO pin: the Raspberry Pi uses 3.3V logic, and I've read that a GPIO pin can source a maximum of about 16mA :(
I had to find another way.

I eventually came up with the circuit shown in the attachment.

I'm using an IRL540N MOSFET (link) to drive the LED from the Raspberry Pi's 5V power supply pin.
The MOSFET gate is attached to the GPIO through a resistor to limit the (theoretically already minimal) current drawn from the GPIO.
I've also placed a pull-down resistor so that the gate charge discharges to ground when the GPIO is LOW.

These R1 and R2 are 10K and 100K Ohms respectively (I chose the values empirically).

I chose a 2.2 Ohm resistor (R3) in series with the LED; measured this way, the current values are pretty near the one I want.

I have now built 4 copies of the circuit, mounted on an Adafruit Perma-Proto HAT for the Raspberry Pi (link).

I'm now seeing different behaviour from each of them. While two of them work pretty much the way I wanted, the others do not:

  1. in one case the LED burnt out after working for a while;
  2. the other one showed a strange flickering pattern (not related to the PWM). It worked like that for a day and then stopped working altogether.
I suppose that in the first case the current through the LED was much higher than expected (maybe because of resistor or transistor tolerances?).
In the second case I can't figure out what the problem could be. It seems like a transistor problem, but I'm not sure (I have only limited knowledge of electronics).

What am I doing wrong?
Does anyone have experience with setups like this?
Any help will be much appreciated.

Nicola Ariutti



Joined Mar 30, 2015
Welcome to AAC!

I suspect that the LED operating current is too high. You killed one LED and you're making the bond wire on the other do something that's heat-related. This is a common problem with low quality LEDs. I have a cheap LED flashlight and one of the LEDs starts blinking after it warms up. Letting it cool restores proper operation, until it heats up again.

Unless this is some sort of high power LED, a constant 50mA current seems excessive. Your 2.2 ohm resistor gives a current of about 900mA. Whether that is acceptable depends on the peak current spec for the LED (which you don't have), and the max power dissipation spec (which you also don't have). You need to use more conservative values for an unknown LED; otherwise, you risk burning them out.
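That estimate follows directly from Ohm's law (a sketch, neglecting the MOSFET's on-resistance and any supply sag):

```python
def led_current(v_supply, v_forward, r_series):
    """Ideal LED current through a series resistor, ignoring the
    MOSFET's Rds(on) and any drop in the 5 V supply."""
    return (v_supply - v_forward) / r_series

# Values from the thread: 5 V supply, ~3 V forward voltage, 2.2 ohm resistor.
i = led_current(5.0, 3.0, 2.2)
print(f"{i * 1000:.0f} mA")  # prints "909 mA" -- roughly the ~900 mA quoted above
```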

You'll get better answers if you tell us what the duty cycle is when you're burning things up.


Joined Mar 10, 2018
To turn the FET you have chosen hard on (no pun intended), you should drive it with a Vgs of 3.5V or higher, to get Rds(on) low.

So the gate pull-down R2 should be a few K, and the series gate resistor R1 100 ohms or so, to minimize the voltage loss due to divider action. R1 decouples the gate capacitance from the µP, but the downside is that it slows down turn-on of the MOSFET. R2 is there for when the µP is off, to keep the MOSFET off when the gate pin would otherwise be effectively floating (µP off or pin in tri-state).

Regards, Dana.
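Dana's point about divider loss can be put into numbers (a sketch; the 4.7K pull-down below is just one example reading of "a few K"):

```python
def gate_divider_vgs(v_gpio, r_series, r_pulldown):
    """Static gate voltage once the gate capacitance has charged:
    the series resistor and the pull-down form a voltage divider."""
    return v_gpio * r_pulldown / (r_series + r_pulldown)

# Original values from the thread: 10K series, 100K pull-down.
print(f"{gate_divider_vgs(3.3, 10_000, 100_000):.2f} V")  # 3.00 V
# With a 100 ohm series resistor and a 4.7K pull-down, almost nothing is lost:
print(f"{gate_divider_vgs(3.3, 100, 4_700):.2f} V")       # 3.23 V
```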

-live wire-

Joined Dec 22, 2017
You are probably operating the MOSFET in its saturation (active) region rather than fully on, which is not desirable. You can use a BJT to drive the LED and completely eliminate this issue, or you can have a BJT drive the MOSFET, where the BJT connects its gate to a high enough voltage. But given the small amount of current the LED needs, just a BJT makes the most sense.

Also, what are the dimensions of the LED? If it looks like a standard 5mm one, it probably can only take 20mA.


Joined Feb 8, 2018
The maximum threshold voltage for that FET is 2.0 V and the minimum transconductance is 12 S, so 3.3 V is more than enough for the required current. At that current, the smallest power MOSFET available or any of a whole range of 5 cent NPN bipolar transistors (e.g. PN2222A, 2N3904, 2N4401, MPS-A06) would work. The FET you are using is OK, but it is capable of switching many amperes and costs a lot more than is necessary for the application.

If you use something like a 2N3904, a base current limiting resistor of around 470 ohms would be appropriate for hard saturation, but 1k would be adequate.

If you use a PWM frequency in the low kilohertz range, the switching slew rate, which is influenced by the resistor in series with the gate (due to gate capacitance), will be irrelevant. I'd probably use about 1k, but twice that or half that would be fine. 10k is a bit on the high side unless the PWM frequency is in the range of a few hundred hertz. 100k is fine for the pull-down.

With either the FET or BJT a current limiting resistor of about 35 to 40 ohms for the LED will yield about 50 mA with a 5 V supply.
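The 35 to 40 ohm figure is again just Ohm's law (neglecting the small drop across the saturated transistor):

```python
def series_resistor(v_supply, v_forward, i_target):
    """Series resistor for a target LED current, ignoring the small
    drop across the fully-on transistor."""
    return (v_supply - v_forward) / i_target

print(f"{series_resistor(5.0, 3.0, 0.050):.0f} ohms")  # 40 ohms for 50 mA
```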


Joined Feb 20, 2016
50mA may well be too high for a common LED, particularly as you say you don't know what the LED is.
20mA is probably a better max to aim for.
So try a 150R LED resistor to get around 20mA and see how that goes.

Thread Starter


Joined Jan 19, 2018
Thank you all for your reply,

unfortunately I was not able to measure the duty cycle of my PWM, nor to change the transistor, but I've placed a much higher resistor in series with the LED (an 820 Ohm, the only value I had at the construction site at that moment) and now the circuit seems to work well.
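For the record, the same Ohm's-law estimate with the 820 Ohm resistor (still assuming the ~3 V forward voltage measured earlier in the thread):

```python
# 5 V supply, ~3 V LED forward voltage, 820 ohm series resistor.
i_ma = 1000 * (5.0 - 3.0) / 820
print(f"{i_ma:.1f} mA")  # about 2.4 mA -- dim, but safe for an unknown LED
```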

Thank you all @dl324, @danadak, @-live wire-, @ebp, @dendad for your support and the useful information you gave me.
Best regards,
Nicola Ariutti