Hello, I'm working on a project with a 5V power supply that stays powered at all times. I want to test a device multiple times (hot-swapping) without cutting power, because waiting for the supply to discharge takes too long for the process I require. The problem is that plugging a device in while the supply is live creates inrush current into the power supply.
While talking to others, they suggested putting a resistor in series with the 5V supply to limit the current. What I don't understand is that they also talked about using a capacitor and a MOSFET to short out the resistor after around 100 ms. I don't want a complete answer on how to do the project, because that's not how you learn, but I'm having trouble understanding how something like that works.
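My rough guess at the timing part, using made-up example values (the resistor, capacitor, and MOSFET threshold below are just assumptions for illustration, not a real design): a capacitor charges through a resistor, and once its voltage passes the MOSFET's gate threshold, the MOSFET turns on and bypasses the series resistor. A quick sanity check of the numbers:

```python
import math

# Assumed example values (not from any real design):
V_supply = 5.0   # supply voltage, volts
V_th = 2.0       # assumed MOSFET gate threshold voltage, volts
R_gate = 100e3   # resistor charging the timing capacitor, ohms
C_gate = 2e-6    # timing capacitor, farads

# RC charging: Vc(t) = V_supply * (1 - exp(-t / (R*C)))
# Solve for the time when Vc reaches the MOSFET threshold:
tau = R_gate * C_gate
t_on = -tau * math.log(1 - V_th / V_supply)

print(f"tau = {tau * 1e3:.0f} ms, MOSFET turns on after ~{t_on * 1e3:.0f} ms")
# prints: tau = 200 ms, MOSFET turns on after ~102 ms
```

If that's the idea, picking R and C so the threshold crossing lands near 100 ms would explain the delay they mentioned.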
Can anyone give me some hints as to what the theory might be behind this?