Detecting Dips in Battery Voltage

Thread Starter

phippstech

Joined Jan 27, 2020
19
I have a circuit that I am trying to construct that will detect a small dip in voltage (around 300mV). The voltage in question (V1) can be 12.5 to 14.4 V. The goal is to output a signal that wakes up an MCU. Power consumption is a big deal here, so we don't want the MCU on unless it needs to be on; reading analog voltage levels with it is out of the question. The circuit works when the difference in voltage is about 500 mV to 700 mV, but even then it sometimes doesn't trigger the output. I suspect the capacitor value we are using isn't high enough, so it doesn't hold enough charge to keep the voltage on the + pin constant long enough for the comparator to detect the difference in voltage. Is there another low-power IC out there that will detect a small dip in voltage?


[Attached schematic: Picture1.png]

crutschow

Joined Mar 14, 2008
34,281
What's the slowest dip in voltage you want to detect?
The sensitivity of the circuit you show depends upon how rapidly the voltage changes.
It has little to do with the sensitivity of the comparator (which will respond to a 10mV difference between the inputs).
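To put a number on that (a rough sketch, assuming the usual topology for this kind of dip detector, i.e. V1 drives the minus input directly and the plus input is held through the RC): for a dip of depth dV that ramps down over T seconds, the RC-held input lags the ramp, so the peak difference the comparator sees is (dV/T)·τ·(1 − e^(−T/τ)), which shrinks as the dip gets slower relative to τ = RC:

    # Peak differential at the comparator for a linear dip of depth
    # dV (volts) over T seconds, with the + input held through tau = R*C.
    import math

    def peak_diff(dV, T, tau):
        # An RC lags a ramp of slope dV/T by (dV/T)*tau*(1 - exp(-t/tau));
        # the lag (and hence the differential) is largest at t = T.
        return (dV / T) * tau * (1 - math.exp(-T / tau))

    for tau in (0.01, 0.1, 1.0, 10.0):
        mv = 1000 * peak_diff(0.3, 1.0, tau)
        print(f"tau = {tau:>5} s -> peak difference: {mv:.0f} mV")

For a 300mV dip over 1 second, that works out to about 3mV of differential at τ = 10ms but about 190mV at τ = 1s.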

Thread Starter

phippstech

Joined Jan 27, 2020
19
crutschow said:
What's the slowest dip in voltage you want to detect?
The sensitivity of the circuit you show depends upon how rapidly the voltage changes.
It has little to do with the sensitivity of the comparator (which will respond to a 10mV difference between the inputs).
So basically the dip in voltage will last around a second. The reason for the dip is just things waking up (LEDs, displays, etc.). I measured the dip that demand puts on the battery to be around 200 to 300 mV. While I agree with you on the 10mV sensitivity of the compared inputs, that 200 to 300 mV should be plenty for the comparator.

I'm thinking the combination of the capacitor discharging too quickly (i.e. losing voltage) and the "dip" voltage (IN-) coming back to normal doesn't give the comparator enough time to consistently detect (IN-) falling below (IN+).

crutschow

Joined Mar 14, 2008
34,281
phippstech said:
I'm thinking the combination of the capacitor discharging too quickly,
Could be, but you didn't post the size of the resistor and capacitor, so I can't tell.
You need an RC time constant of around 1 second; thus, for a 1 megohm input resistor, the capacitor should be 1μF.

But that will be sensitive to a few mV of voltage drop, not just 300mV.
Is that okay?
If not, you will need to add a 300mV offset bias to the inputs, such as with a large resistor from the plus input to ground (about 42 times the input resistance value).
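As a quick sanity check of those numbers (assuming a 12.9V battery, the middle of your 12.5 to 14.4V range):

    # RC time constant with the suggested values
    R_in = 1e6    # 1 megohm input resistor
    C = 1e-6      # 1 uF capacitor
    print(f"tau = {R_in * C} s")    # 1.0 s

    # Offset bias: a resistor R_g from the + input to ground forms a
    # divider with R_in, pulling the + input down by V * R_in / (R_in + R_g).
    # Solving for R_g with a 300 mV target:
    V = 12.9
    R_g = R_in * (V / 0.3 - 1)
    print(f"R_g = {R_g / 1e6:.0f} Mohm = {R_g / R_in:.0f} x R_in")    # 42 Mohm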

Thread Starter

phippstech

Joined Jan 27, 2020
19
crutschow said:
Could be, but you didn't post the size of the resistor and capacitor, so I can't tell.
You need an RC time constant of around 1 second; thus, for a 1 megohm input resistor, the capacitor should be 1μF.

But that will be sensitive to a few mV of voltage drop, not just 300mV.
Is that okay?
If not, you will need to add a 300mV offset bias to the inputs, such as with a large resistor from the plus input to ground (about 42 times the input resistance value).
Interesting. We were testing the circuit with a 2200μF capacitor and a 470 ohm resistor.
But interestingly enough, we put a voltage divider on each of the inputs to cut the input voltages in half, and that seemed to make the circuit work well.
At this point we are trying to make the circuit work with either a 10μF cap or two 10μF in parallel. So in theory, for tau (the time constant) to increase, the resistor value needs to increase, correct? The circuit seems to want to work with a 10μF capacitor, but the output is about 50 percent of the supply voltage.
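Doing the arithmetic on that (just τ = R·C rearranged for R, taking a ~1 second target from the dip duration):

    tau_target = 1.0          # ~1 s, roughly the dip duration
    for C in (10e-6, 20e-6):  # one 10 uF cap, or two in parallel
        print(f"C = {C * 1e6:.0f} uF -> R = {tau_target / C / 1e3:.0f} kOhm")
    # For comparison, the original 470 ohm with 2200 uF already gave
    # tau = 470 * 2200e-6 = ~1.03 s.

So yes, smaller capacitance means a proportionally larger resistor for the same time constant.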

Thread Starter

phippstech

Joined Jan 27, 2020
19
crutschow said:
Yes, you would increase the resistor value to increase the time constant (R × C).
Thanks, I think I'm in a better place with this stuff now. Appreciate your help. I haven't used this platform before; is there any way I can vote one of your responses as the correct answer?
 