Making an Electronic Weight-Based (Non-Tipping) Rain Gauge

Thread Starter

wxman

Joined Oct 13, 2022
69
Most digital consumer rain gauges use a tipping-bucket design: the buckets fill with a known amount of water (typically the amount corresponding to 0.01 in or 1 mm, depending on region), then tip over to dump the water while triggering a reed switch. It's a simple, low-cost design, but accuracy is ±several percent at best because it's sensitive to rain rate and other factors. For that reason, tipping buckets are often not accepted for official measurements (e.g. the CoCoRaHS program, affiliated with the United States National Weather Service, requires a manual rain gauge, though it does allow you to weigh the collected water on a kitchen scale for better precision).

This got me thinking about affordable ways to automate the process more accurately by scrapping the tipping buckets for a weight-based approach using load cells and a Raspberry Pi. It's not without drawbacks of its own, which I'll point out later.

My initial thinking was to use an upside-down bucket with a funnel as the rain entrance. Water flowing out of that funnel passes through a normally-open solenoid valve and drains into a PVC pipe, which has a normally-closed solenoid valve at its bottom, so water is stored inside the pipe. The pipe is mounted against a flat board with load cells attached. At a given time interval (say once per day), and/or once a certain rain amount has accumulated (before the pipe fills), the system automates the emptying process by applying voltage to the two solenoids. The bottom solenoid opens, draining the stored water out of the pipe, while the entrance solenoid closes (if it's still raining, that rain is temporarily stored in the top funnel). After enough time to empty (a few seconds to a minute), power is cut to the solenoids, closing the bottom valve and reopening the top one.

Sketch:

weight-rain-gauge1.JPG

Sketch while collecting and weighing rain:

weight-rain-gauge04.JPG

Sketch During Emptying:

weight-rain-gauge05.JPG

As alluded to earlier, there are some issues with the design. The most significant is "load cell creep", where the weight values drift over time when the cells are under constant load. There could also be drift due to temperature, humidity, and other environmental changes.

A potential workaround could be to automate the tare function frequently, zeroing the scale before any drift accumulates, and have a script log the value before each tare so it can be added into the daily rain total. It would also have to tare after each emptying phase completes.
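The tare-and-accumulate bookkeeping described above can be sketched in a few lines of Python. `RainAccumulator` and its method names are illustrative, not an existing library; the actual scale-reading and zeroing calls are omitted:

```python
class RainAccumulator:
    """Accumulates total rainfall across periodic tares so that
    zeroing the scale never loses already-collected rain."""

    def __init__(self):
        self.total_g = 0.0   # water logged before the last tare
        self.last_g = 0.0    # most recent post-tare reading

    def sample(self, grams):
        """Record the current (post-tare) scale reading."""
        self.last_g = grams

    def tare(self):
        """Bank the current reading; the caller then zeros the scale."""
        self.total_g += self.last_g
        self.last_g = 0.0

    def daily_total_g(self):
        """Daily total = everything banked plus what is on the scale now."""
        return self.total_g + self.last_g
```

The script would call `sample()` on every scale read and `tare()` immediately before each automated zeroing (including after each emptying cycle), so the running total is never reset.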

The size I would need to make the gauge depends on the significance of this drift. For example, I would prefer it to be accurate to the nearest 0.01 in. If I use an 8 inch (~20 cm) diameter collection funnel, that works out to roughly 8 grams of water for each 0.01 in. A larger collection area means more grams per 0.01 in (and thus more weight error I could tolerate while maintaining the needed accuracy). On the flip side, the larger the gauge, the harder it is to work with and the heavier the standing load (less precision from the load cell). Not having worked with load cells under constant and changing loads, or subject to rapid temperature changes, I'm not sure what kind of drift errors to expect.
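The grams-per-0.01 in figure follows directly from the funnel area, so it's worth checking numerically. A rough Python sketch, assuming water at 1 g/cm³ and ignoring wetting losses:

```python
import math

def grams_per_hundredth_inch(diameter_in):
    """Water weight (grams) collected per 0.01 in of rain for a
    circular funnel of the given diameter in inches."""
    area_cm2 = math.pi * (diameter_in * 2.54 / 2) ** 2  # funnel area in cm^2
    depth_cm = 0.01 * 2.54                              # 0.01 in of rain, in cm
    return area_cm2 * depth_cm                          # 1 g per cm^3 of water

print(round(grams_per_hundredth_inch(8.0), 1))  # → 8.2
```

An 8 in funnel comes out to roughly 8 g per 0.01 in of rain, so a 1 g scale error costs about a thousandth of an inch of resolution.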

I would also need to figure out some sort of switch that a script could activate to send voltage for emptying at the appropriate times.
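The emptying sequence could be driven by any script-controlled relay. Here is a hardware-agnostic sketch where `set_solenoid_power` is a placeholder for whatever actually drives the relay (for example a Raspberry Pi GPIO write); the function name and parameters are illustrative:

```python
import time

def run_empty_cycle(set_solenoid_power, drain_seconds=30, sleep=time.sleep):
    """Energize both solenoids (closing the inlet, opening the drain),
    wait long enough for the pipe to empty, then de-energize.

    set_solenoid_power(True/False) is a stand-in for the real relay
    driver; passing sleep as a parameter makes the timing testable."""
    set_solenoid_power(True)    # inlet closes, drain opens
    sleep(drain_seconds)        # let the stored water run out
    set_solenoid_power(False)   # drain closes, inlet reopens
```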

Interested in hearing others' thoughts on how to make this work and any possible improvements.
 

crutschow

Joined Mar 14, 2008
38,423
Seems like a lot of added complexity to gain a small amount of accuracy.
Are you looking to make this a commercial product?
 

Sensacell

Joined Jun 19, 2012
3,780
Skip the loadcell idea and just trigger at a specific water level.
This would be far simpler and more accurate.

You could detect the level using optical, electrical conductivity, capacitance, or even a simple float switch.
 

MrAl

Joined Jun 17, 2014
13,686
Hello there,

Can't you get the measurement from the volume collected?

Also, what are you trying to measure, is it the daily rainfall or the hourly rainfall or the rate of rainfall?

8 inches is a good collection area size. That's about 50.3 square inches. That's about 27.9 ounces per inch of rainfall.
Using a diameter of 8.3 inches gets you close to a nice round 30 ounces per inch of rainfall.
Ounces here refers to U.S. fluid ounces.
The 30 ounces would weigh about 31.4 ounces (weight).
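Those round numbers can be checked with a few lines (1.8047 in³ per US fluid ounce is the standard conversion):

```python
import math

US_FLOZ_IN3 = 1.8047          # one US fluid ounce in cubic inches

def floz_per_inch_of_rain(diameter_in):
    """US fluid ounces collected per inch of rainfall for a circular
    funnel of the given diameter in inches."""
    area_in2 = math.pi * (diameter_in / 2) ** 2   # collection area
    return area_in2 / US_FLOZ_IN3                 # volume per inch of rain

print(round(floz_per_inch_of_rain(8.0), 1))   # → 27.9
print(round(floz_per_inch_of_rain(8.3), 1))   # → 30.0
```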
 

Thread Starter

wxman

Joined Oct 13, 2022
69
Seems like a lot of added complexity to gain a small amount of accuracy.
Are you looking to make this a commercial product?
At this point, I'm just trying to rig up something cheap to see what methods work. The bucket can be a paint bucket from the hardware store, the funnel a car oil-changing funnel, etc. Nothing professional-looking. I'm just trying to match the accuracy of a manual gauge in real time, without having to go out in the elements to take the measurements.

If a method works exceptionally well, I could always consider expanding on the idea, prettying it up, and making it available as a commercial product in the future. Many in the industry have expressed frustration with the effort and time constraints of manual human measurements, so there would be a market for it. But that's just a "maybe" for sometime down the road. For now, I'm just experimenting with a homemade solution for personal use.

Also, what are you trying to measure, is it the daily rainfall or the hourly rainfall or the rate of rainfall?
All of the above, actually. Ideally, the goal is to log a new raw measurement every few seconds, then let a script do the math to calculate all the needed variables: daily total, monthly/yearly totals, and rain rate (how much would fall if it continued at this rate for an hour). There are scripts (and open-source software) available that will calculate all that, upload the data live to the internet, etc. I just need a hardware solution that samples weight, depth, or some other usable property of the water every few seconds so the scripts have data to apply the math to.
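The per-sample math those scripts apply is straightforward. For instance, extrapolating an hourly rain rate from two consecutive totals (a sketch; `rain_rate_per_hour` is an illustrative name, not from any particular package):

```python
def rain_rate_per_hour(prev_total_in, curr_total_in, interval_s):
    """Extrapolated hourly rain rate (inches/hour) from two consecutive
    accumulated totals sampled interval_s seconds apart."""
    delta = curr_total_in - prev_total_in
    return delta * 3600.0 / interval_s

# 0.01 in accumulated over a 36 s interval extrapolates to ~1.0 in/hr
rate = rain_rate_per_hour(0.10, 0.11, 36)
```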

A weight-based approach seems to be the best option, if it can be done, for a number of reasons.

1. The weight method of placing the contents of a manual gauge on a kitchen scale is already approved for official use in National Weather Service reporting/records. An automated process that accurately replicates this shouldn't need any additional approval.

2. It also captures the weight of droplets stuck to the inner sides of the gauge.

3. Accuracy is not sensitive to rain rate.

4. Alternative bucket designs could measure the liquid equivalent of snow without melting it first, since the weight of snow does not change once melted to water. Other methods require a heating element to melt the snow and then measure the liquid drainage (leading to delayed measurements, evaporation during heating, etc.). With weighing, you could weigh the snow instantly as it falls, then use a heating element afterward only to assist in emptying (or skip the heating element entirely and manually dump the bucket every few days).

The only issue would be how to deal with scale drift from the load cell being under constant load, as well as temperature/humidity changes.

Another method I considered was using either a laser rangefinder or an ultrasonic sensor to measure the height of the water line in the gauge. But you'd have to deal with waves/sloshing of the water during heavy downpours, which could make the data values jump around. Plus you'd have to melt snow/ice before measuring its liquid equivalent. The cons seemed to add up the more I thought about these methods.

Skip the loadcell idea and just trigger at a specific water level.
This would be far simpler and more accurate.

You could detect the level using optical, electrical conductivity, capacitance, or even a simple float switch.
A little confused about how this could work. The goal is to measure some quantitative property of the collecting water (e.g. weight to the nearest gram, depth to the nearest millimeter) once every few seconds; a script then applies the math to derive a live rain total, hourly rate, etc. after each data packet is received. If a data packet is only triggered at one specific water level, that level would have to be the equivalent of 0.01 in to maintain accuracy to the nearest 0.01 in. That amount can fall in under a second in heavy rain, so you'd have to measure, dump, and measure again in less than a second.

Whatever method is used would have to produce a precise measurement value that increases linearly as the water level rises.
 

MrAl

Joined Jun 17, 2014
13,686
Hi,

Yes that starts to make sense. I was also thinking how much easier it would be to measure the weight at least in theory.

Well, I guess you know what you have to do then. Now and then, empty the container, apply a known weight or weights, and take the measurements. After that, apply the curve.

I mention applying a set of standard weights because that's how it's done in the medical research field, where weights absolutely must be accurate with no room for (much) error: the electronic scales are calibrated with a set of standard weights. Back in the 1970s the curve would be adjusted using diode breakpoints, but with microcontrollers so common now you won't have to do that. Once you determine the curve, you can apply it, then recalibrate again later.
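In the common case where the load-cell response is close to linear, the standard-weights procedure reduces to a least-squares line fit of raw readings against reference weights. A self-contained sketch (the raw counts below are made-up example readings, not from any real device):

```python
def fit_calibration(raw_counts, known_grams):
    """Least-squares straight line grams = slope*raw + offset, fitted
    from readings taken against a set of standard reference weights."""
    n = len(raw_counts)
    mx = sum(raw_counts) / n
    my = sum(known_grams) / n
    sxx = sum((x - mx) ** 2 for x in raw_counts)
    sxy = sum((x - mx) * (y - my) for x, y in zip(raw_counts, known_grams))
    slope = sxy / sxx
    offset = my - slope * mx
    return slope, offset

# Hypothetical readings against 0, 100 and 200 g reference weights:
slope, offset = fit_calibration([8400, 18650, 28900], [0.0, 100.0, 200.0])
print(round(slope * 18650 + offset, 1))  # → 100.0
```

Refitting periodically (as suggested above) lets the same code absorb slow drift, since each recalibration produces a fresh slope and offset.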

To change weights, I guess you could rig up something robotic, which could be fairly simple. Alternatively, use a known volume of oil or water, or an electromagnetic system that applies a known force via current through a coil, which would make changing weights for calibration easier.

From what little I remember, creep is roughly logarithmic: most of the change occurs early on and then gradually tapers off. You could do some simple experiments to find out.
Also, I think there are load cells today with automatic creep compensation, though I don't know any model numbers offhand. You could at least check into that, and whether the weight ranges suit your application.
 

MisterBill2

Joined Jan 23, 2018
27,362
OK, I see several challenges here. First, the expense of an ACCURATE weight-based measurement system, meaning one whose reading does not vary with the quantity of water that fails to drain out completely. Second, the resolution of the analog-to-digital converter (how many bits), plus the stability and linearity of the analog portion of the measurement system.
Then there are the additional INTRINSIC limitations of the weigh-scale approach: the time needed for some quantity of rain to accumulate, and the requirement to periodically operate the drain valve, which may develop leakage issues and will certainly require power to operate.
Certainly a well-designed weighing system will be more accurate than the cheapest budget-priced, non-shielded tipping-bucket system.
Another option would be a spring-scale type system that breaks a light beam when an adequate weight of water has accumulated. This would avoid the need for an accurate load cell and amplifier, as well as an analog-to-digital converter. It will certainly require accurate calibration, as will the load-cell system and the tipping-bucket scheme.
The bottom-line reality is that a cheaply-done version of any system will not provide the required accuracy, while a well-designed version can easily provide results consistently within 1%, and even better with additional calibration effort.
 

MisterBill2

Joined Jan 23, 2018
27,362
An accurate volume measuring system can be much more stable than an inexpensive load cell weight scale system. Normal accurate load cells include four elements in a bridge arrangement to avoid the problem of the element resistance changing with temperature. The system also requires an amplifier with stable gain and offset, and a stable power supply for the bridge sensors.
Stability is certainly possible but seldom cheap.
 

MrAl

Joined Jun 17, 2014
13,686
An accurate volume measuring system can be much more stable than an inexpensive load cell weight scale system. Normal accurate load cells include four elements in a bridge arrangement to avoid the problem of the element resistance changing with temperature. The system also requires an amplifier with stable gain and offset, and a stable power supply for the bridge sensors.
Stability is certainly possible but seldom cheap.
Hi,

Yeah, I'm assuming this is important enough that the expense is not as much of a concern. Either that, or he's willing to go the distance with the calibration procedures outlined previously.
 

MisterBill2

Joined Jan 23, 2018
27,362
Considering that measuring accumulated rainfall always requires time, while the "bucket dumping" system senses in much smaller increments, it seems a system using both could be created. As each volume was dumped, it could also be accumulated and weighed, easily providing a redundant measure of the same rainfall.
THAT SCHEME could certainly increase the confidence level of the collected data.
 

MrAl

Joined Jun 17, 2014
13,686
Considering that measuring accumulated rainfall always requires time, while the "bucket dumping" system senses in much smaller increments, it seems a system using both could be created. As each volume was dumped, it could also be accumulated and weighed, easily providing a redundant measure of the same rainfall.
THAT SCHEME could certainly increase the confidence level of the collected data.
Yeah depending on the allowable level of complexity.

At one time I considered getting my own weather station for the home. There's a bit of maintenance associated with that though and with the advent of the internet weather services I scrapped the idea.
 

MisterBill2

Joined Jan 23, 2018
27,362
Back at post #1 I see the problem: "Most digital, consumer rain gauges use the tipping bucket design". The assumption being made is that those are the only choice available. Even among "consumer" products there is a large quality spread. AND there is the very real option of building one's own device with much greater attention to low friction and complete water drainage. It is certainly possible to provide adequate protection from both wind and rain impact so that neither has any influence.
As in all measurements, the results depend partly on the selected point of reference.
 

Thread Starter

wxman

Joined Oct 13, 2022
69
Great discussion! Thank you to everyone offering thoughts and suggestions. It's much appreciated!

For a weight solution, I would probably connect a load cell to a Raspberry Pi through an HX711 chip, which amplifies the load-cell voltage, converts it to a digital value, and has calibration options. Calibrating against known weights is the easy part. The hard part is dealing with drift over time from creep and from the temperature swings of being outdoors.
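Whatever HX711 driver library ends up being used (their APIs vary), the conversion from raw ADC counts to grams is the same. A hedged sketch with placeholder calibration constants (the tare count and counts-per-gram factor would come from calibration, and taking the median of a short burst rejects the occasional noisy reading):

```python
import statistics

def counts_to_grams(samples, tare_counts, counts_per_gram):
    """Convert a burst of raw HX711 readings to grams: take the median
    to reject spikes, subtract the stored tare, apply the calibration
    factor. Both constants are placeholders found by calibration."""
    raw = statistics.median(samples)
    return (raw - tare_counts) / counts_per_gram

print(counts_to_grams([10100, 10000, 10050], 10000, 25.0))  # → 2.0
```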

It is true that when a constant load is placed on a load cell, creep is most significant during the first few minutes (as the load cell deforms under the weight), then tends to level off. While my setup would be constantly under load (the weight of the empty collection container, zeroed out in the final output), the load would also be changing as rain fills the container. So even if creep from the container's weight subsides after a while, will it start again when additional weight from rain is added later?

Maybe one way to avoid creep would be to avoid the constant load entirely: some type of lift motor that lowers and raises the gauge onto and off of the scale every so many seconds. Say it lowers onto the scale for 20 seconds while taking a few weight measurements and averaging them, then lifts off for 20 seconds (during which the scale is zeroed), then lowers back on for more measurements. Something like a motorized pan/tilt bracket for a security camera, or maybe there are other mechanisms that could raise and lower it on a schedule?

Then comes the temperature correction. I could always put a temperature sensor on the load cell. If the correction is linear, I can set it based on two or three different temperatures. If it's non-linear, I don't know how I could check every possible temperature to build the curve.

Since load-cell calibration may be a time-consuming process, I'm also rethinking the laser-rangefinder idea as a temporary option. The biggest drawback there is waves and splashing of the water in the collection pipe. Maybe a plastic float on top of the water would help stabilize the level?

laser1.JPG
Or, alternatively, use a second, smaller-diameter entrance pipe that extends almost to the bottom of the measuring container, so water fills the container from the bottom:

laser2.JPG

Any thoughts on that, or other ideas to stabilize the splashing/waves?

Even an ultra-cheap rangefinder should work, provided there's enough vertical distance between significant values.

There could be issues with condensation forming on the rangefinder, since it sits inside a mostly-closed container of water. Even manual gauges sometimes get fog/steam droplets on the sides of the measuring tube, especially if the water sits for an extended period or there are big temperature swings. Perhaps a very small heating element against the rangefinder would be needed to limit this.
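The reason a cheap rangefinder can work at all is the area ratio between funnel and measuring pipe: the level change in the pipe is the rainfall depth multiplied by that ratio. A sketch (the dimensions are example values, not a recommendation):

```python
def rainfall_from_depth(depth_mm, funnel_d_mm, pipe_d_mm):
    """Rainfall depth implied by a water-level change in a narrow
    measuring pipe fed by a wider funnel. The level change is the
    rainfall multiplied by (funnel area / pipe area)."""
    amplification = (funnel_d_mm / pipe_d_mm) ** 2
    return depth_mm / amplification

# 203 mm (8 in) funnel feeding a 50 mm pipe:
print(round(rainfall_from_depth(1.0, 203.0, 50.0), 3))  # → 0.061
```

With those example dimensions, 0.01 in of rain (0.254 mm) raises the pipe level about 4.2 mm, which is comfortably within the stated ~1 mm resolution of a cheap sensor.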
 

MikeA

Joined Jan 20, 2013
446
I have been carrying this idea in my brain for over a decade, but never built it, and it wasn't for rainfall monitoring, but for monitoring the water level in a basement sump pump well.

The idea was to use an ultrasonic sensor that costs a few bucks, like the HC-SR04, to measure the distance to the surface of the water.

I've played with that sensor on the bench, and it is extremely accurate, to 1 mm with averaging. So much so that when I had it pointed at a wall about 12" away for a week, I could track the change of temperature and humidity in the room, since those affect the speed of sound. So you'll probably get much less drift than with measuring weight, since calibrating for sound speed is much easier.
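Compensating an ultrasonic reading for temperature is just a correction to the assumed speed of sound. A sketch using the common linear approximation (roughly 331.3 + 0.606·T m/s in air; the function name is illustrative):

```python
def hcsr04_distance_mm(echo_time_s, temp_c):
    """Convert a round-trip echo time to a one-way distance in mm,
    compensating the speed of sound for air temperature using the
    linear approximation 331.3 + 0.606*T m/s."""
    speed_m_s = 331.3 + 0.606 * temp_c
    return echo_time_s * speed_m_s / 2.0 * 1000.0  # halve for one-way

# A 1.75 ms echo at 20 °C is about 300 mm:
print(round(hcsr04_distance_mm(0.00175, 20.0)))  # → 300
```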
 

Thread Starter

wxman

Joined Oct 13, 2022
69
I was actually looking at that HC-SR04 ultrasonic sensor, as well as the VL53L0X laser rangefinder. Both are very cheap and could work, provided I size the gauge so that the needed significant values (0.01 in rain equivalents) are far enough apart.

While I'm not positive which would work better, I'm leaning toward the laser. From a little research on measuring water levels in a pipe, ultrasonic is generally not recommended for small-diameter pipes, and if water droplets cling to the sides of the pipe, the sound wave can reflect off those drops. I would think the narrower beam of a laser would perform better in this case. Not sure, though.
 

Tonyr1084

Joined Sep 24, 2015
9,744
Just a passing thought: since water has weight, and you can probably get an old MAP sensor (Manifold Absolute Pressure) from a junkyard, and given that it's designed to work in a wide variety of conditions, you could capture rainwater and let the MAP sensor read the rising water pressure as the vessel fills, outputting a signal accordingly. When the catch vessel fills to the point of the overflow tube, the overflow switch activates a solenoid; a timer circuit could keep the solenoid open for 10 seconds (just guessing) to drain the vessel. By monitoring the number of times the overflow sensor activates over a period of time, you can count the overflow events and arrive at a known weight of water. No moving parts beyond the solenoid, a MAP sensor designed for temperature extremes, and a catch vessel sized to bring the MAP's sensitivity to the point you desire. Tubing and drains can be larger as well. Just be sure to account for the water in the tube too.
[sketch attachment: MAP-sensor rain gauge concept]
Just a thought.
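What a MAP-style sensor sees at the bottom of the vessel is plain hydrostatic pressure, P = ρgh. A sketch:

```python
def water_column_kpa(height_m):
    """Hydrostatic pressure at the bottom of a water column, in kPa:
    P = rho * g * h. This is the signal a MAP-style sensor would see."""
    RHO = 1000.0  # kg/m^3, density of water
    G = 9.81      # m/s^2
    return RHO * G * height_m / 1000.0  # Pa → kPa

print(round(water_column_kpa(0.10), 3))  # 10 cm of water → 0.981
```

One rough caveat: if the vessel diameter matches the funnel, a 0.01 in rain increment adds only about 0.25 mm of water, roughly 2.5 Pa, which is tiny against a MAP sensor's typical tens-of-kPa span, so the vessel would need to be much narrower than the funnel for usable resolution.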
 

MisterBill2

Joined Jan 23, 2018
27,362
REALLY, the MAP sensor tube can extend down into the collection container. No need for a "T" fitting below. AND the dip tube diameter can be large enough to avoid any water hanging inside when it is drained.
 

Tonyr1084

Joined Sep 24, 2015
9,744
REALLY, the MAP sensor tube can extend down into the collection container. No need for a "T" fitting below. AND the dip tube diameter can be large enough to avoid any water hanging inside when it is drained.
The TEE is for the drain. There's a solenoid that opens whenever the vessel overflows; it can also be reset by the press of a button. I didn't draw out the entire circuit, but it should be understandable from a concept point of view. The MAP sensor should be mounted above the vessel so it doesn't get water inside it.

If you were to put a dip tube into the tank it would need to be very close to the bottom. Otherwise there could be error introduced into the reading.

Anyway, it's not a serious addition to this thread, just an attempt to spark different thinking.
 