Sampling battery voltages efficiently in low-power wireless mesh networks

Thread Starter

meld2020

Joined Mar 14, 2016
44
I want to use an off-the-shelf LPWAN radio product -- which has mesh networking capability and features an embedded microcontroller -- to sample a 12V battery's voltage. I don't need to do it often, so I'll configure the radio's interface to use synchronized sleep routines, waking it up to take a sample as needed. I want to be reasonably safe about it: perhaps pick a FET (mentioned below) with a Vds rating of 30-40V, to safely handle any battery, or in case I want to monitor a 24V battery later.

Anyhow, I've observed that people don't like simple voltage dividers for this purpose due to the constant power drain over time. So my plan was to do it as follows:


battery1.png


In this way, the controller can trigger the IO pin, enable the voltage divider, wait a very brief moment, take a sample, then go back to sleep.

The questions I have:

1.) Is this unusual? (the method of sampling, versus other approaches)

2.) Is this unwise? (particularly, do parts of the circuit need better protection, such as perhaps a zener or TVS or something on the ADC pin?)

3.) Would a voltage divider really be that horrible? I believe the parameter of interest here is the "input bias current" of an ADC channel, which would indicate the overhead the channel requires. Specifically, I am using an MC9S08QE32 embedded on an XBee radio product, whose "ACMP Electricals" table in the datasheet specifies an active supply current of 35uA max per channel:


acmp.png

This is not "input bias current" verbatim, but I'm assuming it tells me what the channel requires to operate, and thus what I should expect to bleed constantly while sampling the battery if I decided not to sleep.
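To put rough numbers on the "constant power demand" concern, here's a quick back-of-envelope calc. The 15k/1.25k divider values are just an assumed example (not from my schematic), but they show the scale of an always-on divider's drain:

```python
# Rough constant-drain math for an always-on divider across a 12V battery.
# R_TOP/R_BOT are illustrative assumptions, not values from my schematic.
V_BAT = 12.0
R_TOP = 15e3
R_BOT = 1.25e3

i_divider = V_BAT / (R_TOP + R_BOT)          # amps drawn 24/7
mah_per_month = i_divider * 1000 * 24 * 30   # mAh consumed over 30 days

print(f"divider drain: {i_divider*1e6:.0f} uA, {mah_per_month:.0f} mAh/month")
```

That's roughly 740 uA continuous, which dwarfs the 35 uA figure above and is why I want to switch the divider off between samples.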


Thanks,
Mel
 
Last edited:

wayneh

Joined Sep 9, 2010
18,085
I don't quite understand the behavior of the GPIO pins, but can you have one "float" while sleeping? I'm thinking your resistor voltage divider won't draw current if the GPIO port is not pulling the voltage down. Wake it up, ground the GPIO pin, and measure the divided voltage on another input pin. Does that make sense? It would eliminate the MOSFETs and such.
 

ronsimpson

Joined Oct 7, 2019
4,645
to sample a 12V battery's voltage
I like your idea.

Look up the maximum source impedance the ADC input allows. Many micros I have used want 10k or less. You have about 1k now.

When the N-FET is on, it pulls the P-FET gate to ground. That puts the full battery voltage across the G-S. Most likely it is only rated for 20V! I'd say it works for a 12V battery that might get to 15V during charging.
 

ronsimpson

Joined Oct 7, 2019
4,645
You could use a low-voltage rail-to-rail input, rail-to-rail output op-amp as a buffer in front of the ADC, gain of 1. Many op-amps have very low input current. That might allow you to increase the resistor divider by 100x or more. Then just leave it in place all the time.
 

Thread Starter

meld2020

Joined Mar 14, 2016
44
I like your idea.

Look up the maximum source impedance the ADC input allows. Many micros I have used want 10k or less. You have about 1k now.

When the N-FET is on, it pulls the P-FET gate to ground. That puts the full battery voltage across the G-S. Most likely it is only rated for 20V! I'd say it works for a 12V battery that might get to 15V during charging.
Thank you for your feedback! Could you clarify why it's 1k ohm input impedance in its current configuration? (Sorry, still learning the ropes here.) I am assuming the general idea behind the <10k input impedance requirement is that the ADC's input sampling capacitor has to charge quickly enough during the sample window to give a reliable reading. If that's the case, my application affords me to simply sample the channel normally and wait to actually record a value until several samples have already occurred, since I know recording it right away could skew the reading if the voltage dips very briefly.
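To illustrate the "wait and discard" idea with a toy model (everything here is made up for illustration -- an exponential settling curve with arbitrary constants, not the real QE32 ADC API):

```python
import math

# Toy model: pretend the ADC input node settles exponentially toward its
# final value after the divider switches on. v_final, tau_samples, and the
# sample counts are all assumed purely for illustration.
def simulated_adc_read(i, v_final=0.923, v_start=0.0, tau_samples=0.5):
    return v_final + (v_start - v_final) * math.exp(-i / tau_samples)

def sample_battery(n_samples=8, n_discard=4):
    readings = [simulated_adc_read(i) for i in range(n_samples)]
    kept = readings[n_discard:]   # throw away the not-yet-settled samples
    return sum(kept) / len(kept)
```

The first sample reads far low while the node is still settling, but the average of the later samples lands on the true value, which is all I'd be relying on.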

But yes, the thought was: why not use resistors a few orders of magnitude larger to draw far less power, so long as the ratio stays the same? That was the idea behind the switched PFET/NFET circuit: if the input impedance requirements (brief or not) became restrictive, I could just turn the divider on and off as needed and only deal with leakage/quiescent currents.

Would the op amp circuit still be needed for this battery voltage ADC sampler?
 

ronsimpson

Joined Oct 7, 2019
4,645
Thank you for your feedback! Could you clarify why it's 1K ohm input impedance in its current configuration?
I was doing the math in my head so.........
The ADC sees 1.25k//15k = 1.15k (almost 1k). The impedance into the ADC is 1.15k.
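The same math spelled out (I'm assuming the 15k is the high-side resistor and the 1.25k the low-side, since I'm going from the schematic image):

```python
def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

R_TOP, R_BOT = 15e3, 1.25e3              # assumed divider orientation

r_source = parallel(R_TOP, R_BOT)        # what the ADC pin actually sees
ratio = R_BOT / (R_TOP + R_BOT)          # divider attenuation, about 1/13
v_adc = 15.0 * ratio                     # reading with a 15V charging battery

print(f"source impedance: {r_source:.0f} ohms, ADC sees {v_adc:.2f} V at 15V in")
```

Note that scaling both resistors by the same factor keeps the ratio (and the ADC voltage) the same while multiplying the source impedance, which is why the 12.5k/150k question below is the interesting one.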
1773095294045.png
---edited---
Right now you have 1.25k & 15k. I wanted to know if going to 12.5k and 150k would work. Simple change.
If that is close, then with a buffer added you could go to 125k and 1.5M. Just an idea.
 
Last edited:

ronsimpson

Joined Oct 7, 2019
4,645
The top P-MOSFET will break in your circuit. Look at the data sheet: most have a max G-S voltage of 20V, but some can only take 15V. I put a 15V Zener across G-S, but now I think it probably is not necessary with this circuit.
The P-MOSFET and divider is the same.
I changed the bottom transistor to a 2N3904 and used it to make a constant current source. The base goes to the GPIO pin. When the GPIO pin = 0V there is no current. When GPIO = 5V (in this example) there is 5-0.6V = 4.4V across R4. This sets the current. That current flows through R3 and develops about 14V. The voltage across R3 (the G-S voltage) will be 14V max, or about the battery voltage minus 5V. This will save your MOSFET.
1773098060649.png
The graph shows the battery voltage in green 10V to 30V.
The middle voltage is the Gate voltage when the GPIO is at 5V.
The bottom trace is the ADC.
If your supply is 3.3V you will need to adjust R4.
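A quick number check on the above. R3 and R4 here are assumed values (the real ones are only in the image), and the clamp term is my rough approximation of where the 2N3904 runs out of headroom:

```python
def gate_drive(v_bat, v_gpio=5.0, v_be=0.6, r3=15e3, r4=4.7e3):
    # Emitter-degenerated NPN current source: I = (Vgpio - Vbe) / R4.
    i = (v_gpio - v_be) / r4
    # The collector can't pull much below the emitter voltage, so the
    # G-S drive clamps near Vbat - (Vgpio - Vbe), i.e. about Vbat - 5V.
    return min(i * r3, v_bat - (v_gpio - v_be))

print(gate_drive(30.0))   # high battery: drive limited by I*R3, ~14V
print(gate_drive(12.0))   # low battery: drive clamps near Vbat - 5V
```

So across the 10-30V sweep in the graph, the G-S voltage never exceeds about 14V, which is what protects the P-MOSFET.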

RonS.
 

Thread Starter

meld2020

Joined Mar 14, 2016
44
Thanks Ron! This is very insightful and helpful to analyze. Some factors:

1.) Fortunately, I do have the ability to set the requirements of the battery pack. I shouldn't have any issues specifying a 12V battery source in the bill of materials, so that should help with the Vgs problem on the PFET until I can either find one suited to a broader battery voltage range, or until I understand why Vgs tends to be inherently limited here. Do you think a JFET would be an uncommon transistor type for this problem?

2.) The supply will be 3.3V, yes.

3.) I did some digging into the microcontroller's datasheet; here are the ADC characteristics below. My embedded application definitely configures the ADC channels to operate in 10-bit mode. The CodeWarrior IDE with the XBee/DigiMesh plugins has a layer built over the hardware that reports the clock speed as 40MHz. I don't know at this time whether that is the actual fADCK mentioned below for determining analog source resistance limits, so I'll be conservative and assume I need to stay under 5kOhm. This controller uses a SAR-type ADC.

adc1.png

adc2.png

adc3.png

In another project, I currently use a 4-20mA current source in parallel with a 350-ohm resistor to provide what's usually a 1.4V to 2.4V range of readings, and feed that into an OPA241 to bring it to roughly 0.5-2.0V and clean up the readings for our ADC (the same ADC and microcontroller as above). Do you think it would be advantageous to run any signal through an op-amp circuit of this sort, whether it be the battery voltage we've been talking about or a reading from some kind of industrial 4-20mA sensor?

With all this being said, would you change much else if the battery was indeed 12V?

Thanks again!
 
Last edited:

ronsimpson

Joined Oct 7, 2019
4,645
I shouldn't have any issues specifying a 12V battery
(12V battery) If you use MOSFETs that can handle 20V G-S then your first circuit is good. Even if you charge the battery in circuit. (14V or 14.5V)
I need to stay under 5kOhm for this value. This controller uses a SAR type ADC.
It is not clear to me what impedance to stay under; I am guessing 5k ohms. I see they think there might be as much as +/-1uA of leakage current on the pins, so a 1k impedance might cause as much as 1mV of error (worst case). Some companies publish a page on how to use the ADC, but I did not find any better information.
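Putting numbers on that leakage error, and on what happens if the divider is scaled up (10-bit mode and a 3.3V reference assumed here):

```python
I_LEAK = 1e-6                  # +/-1uA worst-case pin leakage
LSB = 3.3 / 1024               # 10-bit ADC with an assumed 3.3V reference

# Worst-case offset from leakage at the current divider and two scaled-up versions.
for r_source in (1.15e3, 11.5e3, 115e3):
    err_v = I_LEAK * r_source
    print(f"{r_source/1e3:6.2f}k -> {err_v*1e3:6.2f} mV ({err_v/LSB:.1f} LSB)")
```

This is the hidden cost of scaling the divider up 100x without a buffer: the power drain drops, but the worst-case leakage error grows from well under 1 LSB to tens of LSBs.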
 

Thread Starter

meld2020

Joined Mar 14, 2016
44
Just as a follow-up on this, I wanted to update the direction I'm taking. Since I have design control over the battery system used, and since one of the primary goals of the circuit is to save power, I thought stepping down to a 4.2V system would ultimately consume less power. This is because all the systems involved in deep sleep, mesh-network "synchronized sleep", and other power-save management features are centered around the microcontroller itself, which requires ~3.5-4.6 Vin.

The selective switching circuit above aims to integrate microcontroller functionality with industrial sensors, particularly 4-20mA pressure sensors, which simplify wiring and keep accuracy independent of installation runs, but are by no means "low power" for this type of application. In a separate, unmentioned part of the design (while it was still a 12V battery system), I was weighing my options between SMPSs and LDOs to provide the ~3.5-4.5V, trying to determine which would better minimize loss, since it would have to run full time.

I then changed my mind again: keep the battery system at a voltage native to the microcontroller, and since we're selectively switching the industrial sensor sampling on and off very infrequently and only as needed, why not include the lossy regulator within that switched branch?

Which brings me to my next question. Would something like this work? (I suppose you can mostly disregard the values of the RCL components, as they are mostly to configure the regulation output of the IC.)

adc_plus+boost.png
 

AnalogKid

Joined Aug 1, 2013
12,043
For all of your MOSFETs (reference designators, please!), the drain and source labels are reversed. The circuit cannot function as drawn.

ak
 
Last edited:

ronsimpson

Joined Oct 7, 2019
4,645
On the LMR61428, pull pin 2 to ground and the IC shuts down. It pulls 0.01uA typical to 2.5uA max.
----edited---- Sorry, it will not work: VBat+ will flow through D1 and lift Vout to the battery voltage. ----
1775494243966.png
 

Thread Starter

meld2020

Joined Mar 14, 2016
44
Thank you for pointing this out. Here is what I think I am trying to accomplish, with the FET symbols corrected and the boost regulator slightly altered in its wiring at the BOOT, EN, and VDD pins. (The output will be configured to provide anywhere from 10.5V to about 12.6V, and the absolute physical limit on the BOOT(strap) pin is 10V, so the datasheet advises wiring it to whatever provides Vdd.)

If the PFETs are driven on by applying a positive voltage to the gate of the NFET (thus pulling the gate voltage of both PFETs to 0V and allowing them to conduct), then I'm simply trying to form a high-side switch that selectively powers the LMR61428 (datasheet). I would also have control over how long this lasts in the GPIO code.

The goal is to provide a fairly stable 10.5-14V (nominally 12V) supply for the industrial 4-20mA pressure sensor every so often, then turn it off and sample again later. Basically, the left half of the schematic is for battery voltage sampling at the ADC, and the right half enables power to the pressure sensor, which I will also sample via another ADC channel (not yet pictured). I would likely sample the battery voltage independently some small amount of time before or after (requiring two independent GPIO channels), OR at the same time using one GPIO channel (shown), but with a small wait to allow the battery voltage reading to stabilize after any inrush, should that actually be significant.

adc_plus+boost.png
 
Last edited:

Thread Starter

meld2020

Joined Mar 14, 2016
44
To add: I am just bridging the PFET-switched voltage to the LMR61428's EN pin full time, since I wouldn't energize that branch of the circuit otherwise.
 