# Current Limiting Techniques

Discussion in 'General Electronics Chat' started by amateur, May 3, 2007.

1. ### amateur Thread Starter New Member

Hi,

I have built a circuit that uses an AC-DC plug-in converter as the power supply. When I plug in the device that my circuit drives, the voltage supplied by the converter drops. I think I am drawing too much current. I don't have much experience with this; is there any obvious way I can reduce the current drawn by my circuit? Thanks for any help.

2. ### mozikluv AAC Fanatic!

As you have already mentioned, it seems that your circuit (the load) is drawing more current than your supply can deliver. Remember that the load will only draw what it needs at the voltage applied. One solution is to use a supply rated for more current, which is the easier route; the other is to redesign your circuit so that it draws less current.

You did not mention what device you are driving; in fact, your description is quite vague. Please expound further.

moz
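mozikluv's point about supply headroom can be sanity-checked with a quick calculation. This is a minimal sketch; the adapter rating, load current, and 80% derating margin below are illustrative assumptions, not values from the thread:

```python
# Sanity check: does the adapter have enough current headroom for the load?
# All numbers here are illustrative assumptions, not from the thread.

def supply_ok(load_current_a: float, adapter_rating_a: float,
              margin: float = 0.8) -> bool:
    """Return True if the load stays within `margin` of the adapter's rating.

    Running an adapter right at its rated current invites voltage droop,
    so we only accept loads up to a derated fraction of the rating.
    """
    return load_current_a <= adapter_rating_a * margin

# A 300 mA load on a 1 A adapter is comfortable; 900 mA is not.
print(supply_ok(0.3, 1.0))  # True
print(supply_ok(0.9, 1.0))  # False
```

If the check fails, the fixes are the ones mozikluv names: a bigger adapter, or a less hungry circuit.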

3. ### amateur Thread Starter New Member

Hi, thanks for the reply. Sorry for the vagueness: the device is an LNB for a VSAT, and the circuit supplies 12 V/18 V switching signals to the LNB via a coaxial cable. It uses a MAX1771 (a 12 V or adjustable, high-efficiency, low-IQ step-up DC-DC controller) to generate the 12 V/18 V supply. The circuit works fine at 12 V, but when I switch it to 18 V the power supply's voltage drops. Is the resistance of the circuit the main thing that determines the current drawn?

4. ### nomurphy AAC Fanatic!

Generally, it's the nature of the beast. When the boost converter steps the output up from 12 V to 18 V, it must draw proportionally more input current to deliver the same load current (power in roughly equals power out, so a higher output voltage at the same load current means a higher input current from the adapter). If that pushes the draw beyond what the supply is designed for, then yes, you will very likely experience voltage droop.
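The effect nomurphy describes can be sketched numerically. This is a rough boost-converter input-current estimate; the LNB load current and converter efficiency below are assumptions for illustration, not figures from the thread:

```python
# Rough input-current estimate for a boost (step-up) converter.
# The 350 mA LNB draw and 85% efficiency are illustrative assumptions.

def input_current(v_out: float, i_out: float, v_in: float,
                  efficiency: float = 0.85) -> float:
    """I_in ~= (V_out * I_out) / (V_in * efficiency) for a boost converter."""
    return (v_out * i_out) / (v_in * efficiency)

V_IN = 12.0    # assumed adapter output voltage
I_LNB = 0.35   # assumed LNB current in amps

i_in_12 = input_current(12.0, I_LNB, V_IN)
i_in_18 = input_current(18.0, I_LNB, V_IN)
print(f"12 V out: {i_in_12:.2f} A in;  18 V out: {i_in_18:.2f} A in")
```

With these assumed numbers the 18 V setting pulls roughly 50% more current from the adapter than the 12 V setting, which is exactly the kind of step that can push a small plug-in converter past its rating and make its voltage sag.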