Current Limiting Circuit

Discussion in 'The Projects Forum' started by redalertcavy, Nov 2, 2015.

  1. redalertcavy

    Thread Starter New Member

    Nov 2, 2015
    Hello, I'm working on a project where I have a 5 V power supply that is powered up at all times. I want to test a device repeatedly (hot swapping) without cutting power, because waiting for the supply to discharge takes too long for the process I require. The problem is that plugging the device in while the supply is live creates an inrush current surge on the power supply.

    When I talked this over with others, they suggested putting a resistor in series with the 5 V supply to reduce the current. What I don't understand is that they also talked about using a capacitor and a MOSFET to short out the resistor after around 100 ms. I don't want to be handed the answer, because that's not how you learn, but I'm having trouble understanding how something like that works.

    Can anyone give me some hints as to what the theory might be behind this?
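
    To see why this is a problem in numbers, here is a minimal sketch (Python, purely illustrative; the load capacitance and path resistance are assumed values, not from the post) of what happens when a discharged bulk capacitor is plugged straight onto a live 5 V rail:

    ```python
    # Why hot-plugging into a live rail causes an inrush spike.
    # All numbers are assumed for illustration, not taken from the thread.
    V_SUPPLY = 5.0      # volts, the always-on rail
    C_LOAD = 1000e-6    # farads, assumed discharged bulk capacitance in the device
    R_PATH = 0.1        # ohms, assumed wiring/connector/ESR resistance in the path

    # At the instant of contact the discharged capacitor looks like a short,
    # so the only thing limiting the current is the small path resistance.
    i_peak = V_SUPPLY / R_PATH
    tau = R_PATH * C_LOAD  # how quickly that surge decays

    print(f"peak inrush ~ {i_peak:.0f} A, decaying with tau ~ {tau*1e6:.0f} us")
    # ~50 A for ~100 us with these assumed numbers -- enough to droop the rail
    # or trip the supply's protection, which matches the symptom described above.
    ```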
     
  2. DickCappels

    Moderator

    Aug 21, 2008
    The MOSFET's source-drain channel can be used as a switch for one polarity (to the opposite polarity it looks like a diode, because of the body diode). The switch turns on gradually as the gate-to-source voltage increases, and the rate at which that gate-source voltage rises can be controlled with a resistor and a capacitor...
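
    To put rough numbers on that idea, here is a minimal sketch (Python, purely illustrative; the gate threshold and the R and C values are assumed, not from the thread) of how an RC network on the gate delays the MOSFET turning on:

    ```python
    import math

    # Assumed values -- not from the thread, just for illustration.
    V_SUPPLY = 5.0        # volts, gate drive voltage (the 5 V rail)
    V_GS_TH = 2.5         # volts, assumed MOSFET gate threshold voltage
    R_GATE = 100e3        # ohms, resistor charging the gate
    C_GATE = 1e-6         # farads, capacitor on the gate

    tau = R_GATE * C_GATE  # RC time constant, seconds

    # The gate voltage follows the capacitor charging curve:
    #   v_gs(t) = V_SUPPLY * (1 - exp(-t / tau))
    # Solve v_gs(t) = V_GS_TH for t to estimate when the MOSFET starts to conduct.
    t_on = -tau * math.log(1 - V_GS_TH / V_SUPPLY)

    print(f"tau = {tau*1e3:.0f} ms, gate reaches {V_GS_TH} V after ~{t_on*1e3:.0f} ms")
    # With these values: tau = 100 ms and the threshold is reached after ~69 ms,
    # the same order of magnitude as the ~100 ms delay mentioned in the thread.
    ```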
     
  3. dl324

    Distinguished Member

    Mar 30, 2015
    Kudos to you!
    If you haven't already done so, draw the circuit. Work out what effect the resistor has, both the pros and the cons. Once you've identified the cons, you should understand why it was suggested that you short out the resistor after 100 ms.

    Then, for extra credit, see if you can find other ways to do it.
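
    For readers following along later, here is a rough illustration (Python, with assumed component values that are not from the thread) of the trade-off dl324 is hinting at: the series resistor caps the inrush, but it also drops voltage and wastes power once the device is running, which is why it gets bypassed after the surge is over.

    ```python
    # Rough numbers for the series-resistor trade-off.
    # All component values are assumed for illustration, not from the thread.
    V_SUPPLY = 5.0     # volts
    R_SERIES = 10.0    # ohms, assumed series resistor
    C_LOAD = 1000e-6   # farads, assumed bulk capacitance of the hot-plugged device
    I_LOAD = 0.2       # amps, assumed steady-state current draw of the device

    # Pro: at plug-in the peak inrush is limited to roughly V / R.
    i_inrush_peak = V_SUPPLY / R_SERIES

    # The load capacitor charges through the resistor with tau = R * C,
    # so it is essentially fully charged after about 5 tau.
    t_charge = 5 * R_SERIES * C_LOAD

    # Con: once the device is running, the resistor drops voltage and burns power,
    # which is why it gets shorted out (e.g. by the MOSFET) after the inrush is over.
    v_drop = I_LOAD * R_SERIES
    p_resistor = I_LOAD**2 * R_SERIES

    print(f"peak inrush limited to ~{i_inrush_peak:.1f} A")
    print(f"load cap charged after ~{t_charge*1e3:.0f} ms")
    print(f"steady state: {v_drop:.1f} V dropped, {p_resistor:.2f} W wasted in the resistor")
    # With these numbers: 0.5 A peak and the cap is charged in ~50 ms (inside the
    # ~100 ms window mentioned above), but at 0.2 A the resistor drops 2 V, so the
    # device would only see 3 V -- the resistor clearly has to be bypassed once
    # the surge is done.
    ```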
     