Hi,
I have returned to an old issue I am struggling with: my door lock solenoid project again.
I have created a really simple LTspice circuit driving my solenoid at 1 kHz, 50% duty. I measured the solenoid with an inductance meter as 35 mH with 25R series resistance, and entered these values into the inductor parameters.
I did not add any parallel resistance because, to be honest, I don’t understand what it represents.
I am measuring the current through the 0.1R resistor.
If I drive the MOSFET at 100% duty, the current through the 0.1R is 480 mA, as expected. When I go to the PWM drive, I expect to see a square wave of 0 to 480 mA at 1 kHz, 50% duty.
But that’s not what I see: I am seeing an averaged version of this, as in the attached “Coil2” image, which shows 180 mA.
So I am trying to understand the physics of exactly what is happening here. I assume the inductance of the coil, along with the various resistances, limits the rate of change of current in the coil and creates this effect.
From a previous post, I see that the time constant will be
τ = L/R, so τ = 0.035/25 = 0.0014 s = 1.4 ms
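As a sanity check on that number, here is a minimal Python sketch of the standard RL step response, i(t) = (V/R)(1 - e^(-t/τ)). I have taken the supply as 12 V, since 480 mA through roughly 25R implies about 12 V; the other values are the measured ones above.

```python
import math

V = 12.0          # supply voltage (12 V implied by 480 mA through ~25R)
R = 25.0 + 0.1    # coil series resistance plus the 0.1R sense resistor
L = 0.035         # measured coil inductance, 35 mH

tau = L / R       # time constant, ~1.4 ms
i_final = V / R   # steady-state DC current, ~478 mA

def i_step(t):
    """Coil current t seconds after switch-on, starting from 0 A."""
    return i_final * (1.0 - math.exp(-t / tau))

print(f"tau = {tau * 1e3:.2f} ms, final current = {i_final * 1e3:.0f} mA")
print(f"i(tau)    = {i_step(tau) * 1e3:.0f} mA  (63% of final)")
print(f"i(0.5 ms) = {i_step(0.5e-3) * 1e3:.0f} mA  (one on-period at 1 kHz, 50%)")
```

In a single 0.5 ms on-period the current only climbs to about 140 mA from zero, so at 1 kHz the coil current simply cannot reach 480 mA within one pulse.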
The questions I am trying to answer/understand are:
What does this time constant actually mean? Is it the slope I see when the drive is first applied, i.e. the current building in the coil up to 63% of its final value due to the inductance/resistance, with the average current then set by the applied frequency relative to that rate of change? (I have tried to put numbers on this with a rough simulation after these questions.)
What will the actual current through the coil be: 480 mA pulses, or the averaged waveform I see across the resistor?
Does this mean that the average current seen by the power supply will be even lower than my expected 50%? If we are starting from 180 mA rather than 480 mA, 50% could be as low as 90 mA (which is approximately what I see if I add a CLC filter to this circuit).
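To put some numbers on that first question, below is a rough Euler simulation of the same RL branch under 1 kHz / 50% PWM. One big assumption: it treats the coil voltage as 0 V during the off-time, i.e. the coil current has an ideal freewheel path (such as a flyback diode); without such a path the off-time behaviour is completely different.

```python
import math

V, R, L = 12.0, 25.1, 0.035    # assumed 12 V supply; 25R coil + 0.1R sense
f, duty = 1000.0, 0.5          # 1 kHz, 50% PWM
T = 1.0 / f
dt = 1e-6                      # 1 us Euler step
steps = round(T / dt)

i = 0.0
i_min, i_max, i_sum = math.inf, -math.inf, 0.0

for cycle in range(20):                        # 20 ms, well past 5 * tau
    for n in range(steps):
        v = V if n * dt < duty * T else 0.0    # 0 V / 12 V PWM drive
        i += dt * (v - i * R) / L              # Euler step of di/dt = (v - i*R)/L
        if cycle == 19:                        # statistics over the settled cycle
            i_min, i_max = min(i_min, i), max(i_max, i)
            i_sum += i

print(f"settled coil current: {i_min * 1e3:.0f} to {i_max * 1e3:.0f} mA, "
      f"average {i_sum / steps * 1e3:.0f} mA")
```

On those assumptions the settled current is a small triangular ripple riding on roughly 240 mA (duty × V/R) rather than clean 0 to 480 mA pulses, which is the "averaged" shape described above. Note also that where the 0.1R sits relative to any freewheel path decides whether it sees the full coil current or only the on-time portion.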
If I actually build this and measure the voltage across the resistor, I see square waves of approx. 25 mV, i.e. about 250 mA, which I guess could be close, as the simulation does not take account of the power supply etc.
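For comparison with that bench measurement, the steady-state peak currents of a square-wave-driven RL branch have a standard closed form (same ideal-freewheel assumption and 12 V supply as the simulation above):

```python
import math

V, R, L = 12.0, 25.1, 0.035    # assumed 12 V supply; 25R coil + 0.1R sense
D, T = 0.5, 1e-3               # 50% duty, 1 kHz period
tau = L / R

# Standard steady-state extrema for an RL branch driven by a 0/V square wave.
i_max = (V / R) * (1 - math.exp(-D * T / tau)) / (1 - math.exp(-T / tau))
i_min = i_max * math.exp(-(1 - D) * T / tau)

print(f"i_max = {i_max * 1e3:.0f} mA, i_min = {i_min * 1e3:.0f} mA")
print(f"across 0.1R: {i_min * 0.1 * 1e3:.1f} to {i_max * 0.1 * 1e3:.1f} mV")
```

That gives roughly 200 to 280 mA, i.e. about 20 to 28 mV across the 0.1R, so the approx. 25 mV seen on the real build looks consistent with this picture.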
It’s the fundamental process I need to understand, if anyone can help.

