Power regulator that decreases when power applied

AZJoe78

Joined Feb 14, 2024
1
Good day.

Here's what I'm trying to design: there will be a power in / power out and a trigger power. This will be used to automatically dim LEDs.

There will be constant power on the power-in line (12-14 V DC). When there is 0 V on the trigger line, the power out is either unaffected or regulated to 10 V or more. When the switch is thrown and the same 12-14 V DC is applied to the trigger line, the power out is reduced to 3-5 V DC.

How do I do it?
 

ElectricSpidey

Joined Dec 2, 2017
3,312
Sounds like something you could do with a digitally controlled voltage regulator.

You would need the regulator (an LM317), an NPN transistor, and some resistors (and some caps).

The idea is to use the transistor to augment the voltage divider on the regulator's adjust pin when you turn the transistor on.

[Attached schematic: LM317 regulator with a transistor-switched feedback divider]

In the example above you would only use one transistor.
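A quick sketch of the divider math behind this. The LM317 regulates its output to Vref × (1 + R2/R1), with Vref ≈ 1.25 V; switching an extra resistor in parallel with the lower leg pulls the output down. The resistor values below are hypothetical, chosen only to illustrate the 10 V / 4 V two-state behavior:

```python
# LM317 output: Vout = Vref * (1 + R2/R1), Vref ~= 1.25 V (Iadj neglected)
VREF = 1.25

def lm317_vout(r1, r2):
    return VREF * (1 + r2 / r1)

R1 = 240.0       # upper divider resistor (hypothetical value)
R2 = 1680.0      # lower divider leg with the transistor off (hypothetical)
R_EXTRA = 770.0  # resistor the transistor switches in parallel with R2

# Trigger low: transistor off, full R2 sets the output
v_off = lm317_vout(R1, R2)

# Trigger high: transistor on, R_EXTRA parallels R2 and drops the output
r2_on = (R2 * R_EXTRA) / (R2 + R_EXTRA)
v_on = lm317_vout(R1, r2_on)

print(round(v_off, 2), round(v_on, 2))  # -> 10.0 4.0
```

The real values depend on the transistor's saturation voltage and the Iadj term, so expect to tweak them on the bench.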
 

ElectricSpidey

Joined Dec 2, 2017
3,312
If more current than a 317 can provide is needed you can add a series pass transistor.

Also, an alternative circuit could be placing a power transistor in the emitter follower configuration and using a second transistor to augment the voltage divider.

That arrangement could provide approximately the source voltage minus a diode drop.
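The "minus a diode drop" ceiling is easy to quantify: an emitter follower's output sits one base-emitter drop below its base, so the best-case output is roughly the supply minus ~0.7 V. Supply value below is just the low end of the OP's stated 12-14 V range:

```python
# Emitter-follower pass stage: output tracks base voltage minus Vbe,
# so the maximum achievable output is about the supply minus one Vbe drop.
V_SUPPLY = 12.0  # assumed supply (low end of the OP's 12-14 V range)
V_BE = 0.7       # typical silicon base-emitter drop

v_out_max = V_SUPPLY - V_BE
print(round(v_out_max, 2))  # -> 11.3
```

So with a 12 V input the follower can still meet the "10 V or more" requirement in the untriggered state.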
 

crutschow

Joined Mar 14, 2008
38,325
Below is the LTspice sim of Spidey's circuit with one transistor:
The selected values for R1 and R4 give an output of 10 V (green trace) when the control signal (ctrl, yellow trace) is low, and 4 V when the ctrl signal is high.

[Attached LTspice plot: output voltage (green trace) switching between 10 V and 4 V as the ctrl signal (yellow trace) toggles]
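Working backwards from those two targets shows how the sim's resistor values fall out of the LM317 equation. R1 = 240 Ω is an assumption here (a common choice for the LM317's upper divider resistor); the two solved resistances are whatever hits 10 V with the transistor off and 4 V with it on:

```python
# Solve the divider legs for the 10 V (off) / 4 V (on) targets,
# then find the resistor the transistor must switch in parallel.
VREF = 1.25

def required_r2(r1, v_target):
    # invert Vout = VREF * (1 + R2/R1)
    return r1 * (v_target / VREF - 1)

def parallel_member(r_total, r_existing):
    # resistor that, in parallel with r_existing, yields r_total
    return 1.0 / (1.0 / r_total - 1.0 / r_existing)

R1 = 240.0                      # assumed upper divider resistor
r2_off = required_r2(R1, 10.0)  # lower leg with transistor off
r2_on = required_r2(R1, 4.0)    # effective lower leg with transistor on
r_switch = parallel_member(r2_on, r2_off)
print(round(r2_off), round(r2_on), round(r_switch))  # -> 1680 528 770
```

In practice you would round to standard E24/E96 values and accept a small error in the two output levels.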
 