# Detect sudden voltage drop

Discussion in 'The Projects Forum' started by wiwah, Jul 27, 2011.

1. ### wiwah Thread Starter New Member

Hi Everyone,
I am new to this forum, so if this has already been answered or I am posting in the wrong place, please let me know.

I am trying to design a circuit that will detect a sudden DC voltage drop, e.g. a drop from 1 V to 0.9 V. My problem is that the input voltage is not fixed (it can sit anywhere between 0.8 V and 1.6 V), but regardless of the input level at any given time, I want to detect whether the voltage has dropped by at least 0.1 V. I cannot use a microcontroller for this at the moment. I was considering using some sort of delay and then comparing the outputs, but I would be interested in hearing any tips or ideas you may have.

Thanks very much,

wiwah

2. ### debjit625 Well-Known Member

Two things could be possible.
1) Provide a fixed reference voltage to detect the drop against. You can't have the input sitting anywhere between 0.8 V and 1.6 V and still detect a 0.1 V drop against a single fixed reference, but with a fixed reference voltage it can be done simply, using an operational amplifier as a comparator.

2) Compare against a fixed time window, i.e. detect a 0.1 V drop anywhere in the 0.8 V - 1.6 V range within a fixed interval. This could also be done using operational amplifiers, but an MCU is the better option.
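As a quick numerical sketch of option 1 (all values here are made up for illustration, not from the thread): a fixed-reference comparator only reports whether the input has fallen below one known level, which is why it can't follow a baseline that wanders between 0.8 V and 1.6 V.

```python
# Sketch of a fixed-reference comparator (hypothetical values): the output goes
# high only when the input falls below V_REF, so a drop from a different
# baseline is missed entirely.

V_REF = 0.9  # e.g. a nominal 1.0 V level minus the 0.1 V drop of interest

def comparator(v_in, v_ref=V_REF):
    """Model an op-amp comparator: high when the input is below the reference."""
    return v_in < v_ref

print(comparator(1.00))  # → False: input at nominal level
print(comparator(0.85))  # → True: dropped more than 0.1 V below nominal
print(comparator(1.35))  # → False: the same 0.15 V drop from a 1.5 V baseline is missed
```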

Good Luck

3. ### t_n_k AAC Fanatic!

Depends on how rapidly the longer-term variations (in the 0.8 V to 1.6 V region) occur with respect to the "sudden change" of 0.1 V you need to detect.

One method might involve a long-term averaging circuit in conjunction with a comparator fed by this longer-term average signal and the unmodified input signal.
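One quick way to see whether that would discriminate: model the long-term average as a one-pole low-pass and the comparator as a threshold on (average − input). The signal shape, filter constant, and threshold below are all invented for illustration.

```python
# Numerical sketch of the averaging-plus-comparator idea (hypothetical values,
# not a circuit simulation): a slow moving average tracks the baseline, and the
# "comparator" flags samples that fall 0.1 V below it.

def detect_drops(samples, alpha=0.01, threshold=0.1):
    """Return indices where a sample falls `threshold` below the slow average."""
    avg = samples[0]
    hits = []
    for i, v in enumerate(samples):
        if avg - v >= threshold:      # comparator: input well below the long-term average
            hits.append(i)
        avg += alpha * (v - avg)      # RC-like long-term average (one-pole low-pass)
    return hits

# Slowly rising baseline from 0.8 V with a sudden 0.15 V dip at sample 500.
signal = [0.8 + 0.0002 * i for i in range(1000)]
for i in range(500, 520):
    signal[i] -= 0.15

print(detect_drops(signal))
```

Because the average lags a little behind a rising baseline, the slow drift alone never trips the threshold, while the sudden dip does.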

4. ### wiwah Thread Starter New Member

Hey, thanks so much for the quick replies.
I will research what you all have suggested and let you know how I get on.

Thanks again.

5. ### t_n_k AAC Fanatic!

This seems to work as far as simulation goes ...

[Attachment: Small Change on DC detector.jpg]
6. ### ErnieM AAC Fanatic!

Here's one way:

The - input sees your signal directly.

The + input sees a slightly reduced input (by the resistor divider ratio) version of the input, and will hold the steady state value for a time determined by C1.

Normally the - input is >> the + input, so the output is low.

When the input changes negatively by some significant amount, the - input becomes << the + input, so the output goes high.

As the input changes positively, the - input is again >> the + input (the + input lags because of the cap), so the output returns low.
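A rough discrete-time model of this arrangement (divider ratio, filter constant, and signal values are all hypothetical): the + input is the signal scaled down by the divider and held by C1, modelled here as a slow low-pass, while the - input is the raw signal.

```python
# Model of the divider-plus-cap comparator idea (hypothetical values): a sudden
# negative step drives the "-" input below the held "+" input and the output
# goes high; a steady input never triggers.

def comparator_trace(samples, divider=0.95, alpha=0.02):
    v_plus = samples[0] * divider        # C1 starts charged to the divided-down input
    out = []
    for v in samples:
        out.append(v < v_plus)                    # comparator: high when - input < + input
        v_plus += alpha * (v * divider - v_plus)  # C1 slowly follows the divider tap
    return out

flat = [1.2] * 200                     # steady 1.2 V input: no trigger expected
dropped = flat[:100] + [1.05] * 100    # sudden 0.15 V drop at sample 100
print(any(comparator_trace(flat)))     # → False
print(comparator_trace(dropped)[100])  # → True
```

Note that the effective threshold is (1 − divider) × input, about 60 mV at 1.2 V in with these numbers, so it scales with the input level rather than staying at a fixed 0.1 V.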

7. ### t_n_k AAC Fanatic!

Nice idea - much simpler than what I suggested.

Can you get reliable discrimination of value changes over the full input range of 0.8 to 1.6 V? What about false responses where the momentary drop is less than 0.1 V? You might get a broad spread in switching threshold.

You might need to include some hysteresis.

I'll try playing with some simulations to see how it goes. Or have you already done that and come up with a reliable solution?

8. ### t_n_k AAC Fanatic!

Suppose one had the two following inputs. The first has 100mV transient changes superimposed on a slowly varying signal in the 0.8 to 1.6V range while the second has only 75mV transient changes on a similar slowly varying signal. Could a suitable implementation of the circuit differentiate the two cases?

[Two attachments: the example input waveforms]
9. ### THE_RB AAC Fanatic!

A similar thing was commonly done in televisions using a single transistor.

Bias an NPN transistor just into conduction using a high-value resistor (1 meg?) to +5 V, then capacitor-couple the signal to its base through a small cap.

Any significant negative-going spike will turn the transistor off for a short duration.
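In software terms the cap-coupled base is a high-pass filter: only fast edges get through, and a threshold on the high-passed signal stands in for the transistor cutting off. A sketch, with a made-up filter constant and waveform:

```python
# Software analogue of the AC-coupled transistor trick (values hypothetical):
# a first-order high-pass passes only fast changes, so a threshold on its
# output reacts to sudden drops while ignoring the slowly wandering level.

def highpass_hits(samples, alpha=0.9, threshold=-0.1):
    """Return indices where the high-passed signal dips below `threshold`."""
    prev_in, prev_out = samples[0], 0.0
    hits = []
    for i, v in enumerate(samples[1:], start=1):
        prev_out = alpha * (prev_out + v - prev_in)   # one-pole high-pass
        prev_in = v
        if prev_out <= threshold:                     # "transistor turns off"
            hits.append(i)
    return hits

# Slow drift upward from 0.8 V plus a sharp 0.15 V step down at sample 300.
sig = [0.8 + 0.001 * i for i in range(600)]
for i in range(300, 600):
    sig[i] -= 0.15

print(highpass_hits(sig))
```

The slow drift produces only a few millivolts at the filter output, so only the sharp edge registers, and only briefly, matching the "short duration" turn-off described above.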

10. ### wiwah Thread Starter New Member

Hey Everyone,

Well, what I went with was just a simple RC circuit on the input signal (delay approx. 1 s). I then used a difference amp to compare the original and delayed signals, and a comparator to check whether the difference exceeded the required threshold voltage. In basic tests it seems to perform OK. Thanks for the suggestions, and if anyone knows a better way, or has suggestions to improve the circuit, I'd love to hear them.
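Roughly, the chain behaves like this numerical sketch (the ~1 s RC delay is modelled as a one-pole low-pass; the values are invented, not my actual component values):

```python
# Sketch of the RC-delay + difference-amp + comparator chain (hypothetical
# values): the "delayed" copy is a low-passed version of the input, and the
# comparator fires when (delayed - direct) exceeds the 0.1 V threshold.

def drop_detector(samples, alpha=0.05, threshold=0.1):
    delayed = samples[0]                        # delayed copy: RC low-pass of the input
    out = []
    for v in samples:
        out.append((delayed - v) >= threshold)  # difference amp + comparator
        delayed += alpha * (v - delayed)        # RC "delay" slowly tracks the input
    return out

steady = [1.0] * 100
step = steady[:50] + [0.85] * 50   # sudden 0.15 V drop at sample 50
print(any(drop_detector(steady)))  # → False: no trigger on a steady input
print(drop_detector(step)[50])     # → True: the 0.15 V drop trips the comparator
```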

11. ### ErnieM AAC Fanatic!

That should work; it gives you a fixed reference to compare the difference against. In my version the comparison threshold changes over the input range, so if the 0.1 V level is critical you'll need that second stage.