Detect sudden voltage drop

Thread Starter

wiwah

Joined Jul 27, 2011
3
Hi Everyone,
I am new to this forum, so please let me know if this has already been answered or if I am posting in the wrong place.

I am trying to design a circuit that will detect a sudden DC voltage drop, e.g. a drop from 1 V to 0.9 V. My problem is that the input voltage is not fixed (anywhere between 0.8 V and 1.6 V); however, regardless of what the input voltage is at any given time, I want to detect whether it has dropped by at least 0.1 V. I cannot use a microcontroller for this at the moment. I was considering using some sort of delay and then comparing the outputs, but I would be interested in hearing any tips or ideas you may have.

Thanks very much,

wiwah
 

debjit625

Joined Apr 17, 2010
790
Two things are possible.
1) Provide a fixed reference voltage against which you detect the drop. You can't let the reference sit anywhere between 0.8 V and 1.6 V and still detect a 0.1 V drop from it. With a fixed reference it can be done simply, using an operational amplifier as a comparator (a rough sketch follows below).

2) Provide a fixed time, i.e. detect a 0.1 V drop anywhere from 0.8 V to 1.6 V over a fixed interval. This can also be done with operational amplifiers, but an MCU is the better option.
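
For illustration only, a minimal behavioural sketch of option 1 in Python, assuming an ideal comparator, a fixed 1.0 V reference and made-up input samples (none of these values come from the thread):

```python
import numpy as np

V_REF = 1.0                     # fixed reference voltage (assumed)

# Made-up input samples: nominal ~1.05 V with a brief drop below the reference.
vin = np.array([1.05, 1.06, 1.05, 0.94, 0.95, 1.05, 1.06])

# Ideal comparator: output goes high whenever the input falls below the reference.
out = vin < V_REF
print(out)   # -> [False False False  True  True False False]
```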

Good Luck
 

t_n_k

Joined Mar 6, 2009
5,455
It depends on how rapidly the longer-term variations (in the 0.8 V to 1.6 V region) occur relative to the "sudden change" of 0.1 V you need to detect.

One method might involve a long-term averaging circuit feeding one input of a comparator, with the unmodified input signal on the other input.
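
As a rough illustration of that idea (not an actual circuit), here is a numerical sketch in Python: a first-order low-pass stands in for the long-term average, and a comparator flags the input falling a chosen margin below it. The time constant, margin and test waveform are assumptions:

```python
import numpy as np

fs = 1000.0                          # sample rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)

# Slowly varying baseline inside the 0.8-1.6 V band, plus a sudden 0.1 V dip at t = 6 s.
baseline = 1.2 + 0.4 * np.sin(2 * np.pi * 0.01 * t)
signal = baseline - 0.1 * ((t > 6.0) & (t < 6.2))

# First-order low-pass ("long-term average") with a ~1 s time constant.
tau = 1.0
alpha = (1 / fs) / tau
avg = np.empty_like(signal)
avg[0] = signal[0]
for i in range(1, len(signal)):
    avg[i] = avg[i - 1] + alpha * (signal[i] - avg[i - 1])

# Comparator: high when the input sits more than the margin below its long-term average.
margin = 0.05
out = signal < (avg - margin)
print("drop detected:", bool(out.any()), "first at t ≈", float(t[np.argmax(out)]), "s")
```

With a 1 s averaging time constant, only drops appreciably faster than a second show up against the margin; slower wander is tracked by the average.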
 

Thread Starter

wiwah

Joined Jul 27, 2011
3
Hey, thanks so much for the quick replies.
I will research what you all have suggested and let you know how I get on.

Thanks again.
 

ErnieM

Joined Apr 24, 2011
8,377
Here's one way (schematic attached):

The - input sees your signal directly.

The + input sees a slightly reduced version of the input (set by the resistor divider ratio), and it holds the steady-state value for a time determined by C1.

Normally the - input is >> the + input, so the output is low.

When the input makes a significant negative step, the - input becomes << the + input, so the output goes high.

As the input rises again, the - input goes back to >> the + input (the + input lags because of the cap), so the output returns low.
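
A rough time-domain sketch of that behaviour in Python, assuming an ideal comparator, a divider ratio of 0.95 and a hold time constant of about 1 s (all values are guesses for illustration, not read off the schematic):

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 4, 1 / fs)

# Test input: 1.2 V level with a sudden 0.1 V drop at t = 2 s lasting 0.3 s (assumed).
vin = 1.2 - 0.1 * ((t > 2.0) & (t < 2.3))

DIV = 0.95        # resistor divider ratio on the + input (assumed)
TAU = 1.0         # hold time constant set by C1 and the divider resistance (assumed)

alpha = (1 / fs) / TAU
vplus = np.empty_like(vin)
vplus[0] = DIV * vin[0]
for i in range(1, len(vin)):
    # + input follows the divided-down signal slowly (held up by C1).
    vplus[i] = vplus[i - 1] + alpha * (DIV * vin[i] - vplus[i - 1])

vminus = vin                      # - input sees the signal directly
out = vminus < vplus              # ideal comparator: high while the - input is below the + input

print("output high from t ≈", float(t[np.argmax(out)]), "s for",
      round(out.sum() / fs, 3), "s")
```

In this model the divider ratio sets the drop needed to trip the output: roughly 5% of the input level here, so the effective threshold varies across the 0.8-1.6 V range.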
 

t_n_k

Joined Mar 6, 2009
5,455
Nice idea - much simpler than what I suggested.

Can you get good discrimination of the change over the full 0.8 V to 1.6 V input range? What about false responses where the momentary drop is less than 0.1 V? You might get a broad spread in the switching threshold.

You might need to include some hysteresis.

I'll try playing with some simulations to see how it goes. Or have you already done that and come up with a reliable solution?
 

t_n_k

Joined Mar 6, 2009
5,455
Suppose one had the following two inputs. The first has 100 mV transient changes superimposed on a slowly varying signal in the 0.8 V to 1.6 V range, while the second has only 75 mV transient changes on a similar slowly varying signal. Could a suitable implementation of the circuit differentiate the two cases?
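
The attached plots aren't reproduced here, but test signals of that kind are easy to synthesize to exercise any of the sketches above. A possible version in Python (timing and waveform shapes are assumptions; only the 100 mV / 75 mV amplitudes and the 0.8-1.6 V range come from the post):

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 10, 1 / fs)

# Slowly varying carrier spanning roughly 0.8 V to 1.6 V (assumed shape).
slow = 1.2 + 0.4 * np.sin(2 * np.pi * 0.02 * t)

# Negative-going transients: 50 ms dips at a few arbitrary times.
def dips(times, depth, width=0.05):
    d = np.zeros_like(t)
    for t0 in times:
        d -= depth * ((t >= t0) & (t < t0 + width))
    return d

input_a = slow + dips([2.0, 5.0, 8.0], 0.100)   # 100 mV transients (should trip a 0.1 V detector)
input_b = slow + dips([2.0, 5.0, 8.0], 0.075)   # 75 mV transients (should not trip it)

print("A range:", round(float(np.ptp(input_a)), 3), "V; B range:", round(float(np.ptp(input_b)), 3), "V")
```

Feeding input_a and input_b through the same detector shows whether the 100 mV events trip it while the 75 mV ones don't.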
 

Attachments

THE_RB

Joined Feb 11, 2008
5,438
A similar thing was commonly done in televisions using a single transistor.

Bias an NPN transistor just on using a high-value resistor (1 meg?) to +5 V. Then capacitor-couple the signal to its base using a small cap.

Any significant negative-going spike will turn the transistor off for a short duration.
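
A crude behavioural model of that trick in Python, treating the coupling cap and base resistance as a high-pass filter and the transistor as "on" whenever its base stays above roughly 0.6 V. The time constant, bias point and test step are all assumptions:

```python
import numpy as np

fs = 10000.0
t = np.arange(0, 2, 1 / fs)

# Test input: 1.2 V level with a sudden 0.1 V negative step at t = 1 s (assumed).
vin = 1.2 - 0.1 * (t > 1.0)

TAU = 0.01          # s, high-pass time constant from the small coupling cap and base resistance (assumed)
V_BIAS = 0.65       # base bias voltage, "just on" (assumed)
V_ON = 0.6          # rough base-emitter turn-on voltage (assumed)

# High-pass = input minus a first-order low-pass of the input.
alpha = (1 / fs) / TAU
lp = np.empty_like(vin)
lp[0] = vin[0]
for i in range(1, len(vin)):
    lp[i] = lp[i - 1] + alpha * (vin[i] - lp[i - 1])
hp = vin - lp                     # AC-coupled component reaching the base

vbase = V_BIAS + hp
transistor_on = vbase > V_ON

off_time = np.count_nonzero(~transistor_on) / fs
print(f"transistor off for about {off_time * 1000:.1f} ms after the step")
```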
 

Thread Starter

wiwah

Joined Jul 27, 2011
3
Hey Everyone,

Well, what I went with was just a simple RC circuit on the input signal (delay approx. 1 s). I then used a difference amp to compare the original and delayed signals, and then a comparator to check whether the difference was greater than the required threshold voltage. With basic tests it seems to perform OK. Thanks for the suggestions, and if anyone knows of a better way, or just has ideas to improve the circuit, I'd love to hear them.
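
For anyone following along, a behavioural sketch of that chain in Python (RC-filtered copy, difference, then a threshold comparator). The ~1 s RC time constant is from the post and the 0.1 V drop requirement from the original question; the comparator threshold and test waveform are assumptions:

```python
import numpy as np

fs = 1000.0
t = np.arange(0, 10, 1 / fs)

# Test input: slow drift in the 0.8-1.6 V band with a sudden 0.1 V drop at t = 6 s (assumed).
vin = 1.2 + 0.3 * np.sin(2 * np.pi * 0.02 * t) - 0.1 * (t > 6.0)

# 1) RC "delay" (really a ~1 s low-pass) on the input.
TAU = 1.0
alpha = (1 / fs) / TAU
vrc = np.empty_like(vin)
vrc[0] = vin[0]
for i in range(1, len(vin)):
    vrc[i] = vrc[i - 1] + alpha * (vin[i] - vrc[i - 1])

# 2) Difference amp: delayed minus instantaneous (positive when the input has just dropped).
vdiff = vrc - vin

# 3) Comparator against the required threshold.
V_TH = 0.05        # somewhat below the 0.1 V drop, leaving margin for baseline drift (assumed)
out = vdiff > V_TH

print("drop flagged:", bool(out.any()), "first at t ≈", float(t[np.argmax(out)]), "s")
```

Because the RC copy tracks anything slower than about a second, slow wander between 0.8 V and 1.6 V barely shows up in the difference, while a sudden 0.1 V drop does.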
 

ErnieM

Joined Apr 24, 2011
8,377
Well, what I went with was just a simple RC circuit on the input signal (delay approx. 1 s). I then used a difference amp to compare the original and delayed signals, and then a comparator to check whether the difference was greater than the required threshold voltage. With basic tests it seems to perform OK. Thanks for the suggestions, and if anyone knows of a better way, or just has ideas to improve the circuit, I'd love to hear them.
That should work; it gives you a fixed reference to compare the difference against. In my version the comparison point changes over the input range, so if the 0.1 V level is critical you'll need that second stage.
 