What is the standard for pressure?

joeyd999

Joined Jun 6, 2011
5,283
The water column method is very similar to the, "millimeters of mercury" method, but more convenient for rather small amounts of pressure difference.
*But* water vapor pressure (and associated temperature) has a great deal more impact on overall accuracy, and must be accounted for.
 

cmartinez

Joined Jan 17, 2007
8,253
#12, I know you were not talking to me, but here's my 2 cents. I think it's more accurate to measure pressure (with a portable apparatus, outside the lab) using mercury instead of water, since water's vapor pressure tends to have a more significant effect than mercury's. Water is constantly evaporating (maybe mercury is too, but at what rate?). Also, the effects of volume change due to temperature variations would have to be taken into account.

Another direct measurement technique that has not been mentioned in this discussion is the use of load cells. I've successfully used these in the past, for instance. Of course, I'm not 100% sure whether a load cell can properly be considered a direct measurement technique or not.
 

WBahn

Joined Mar 31, 2012
30,058
Air pressure is also measured in units called, "inches of water column" and, "pounds per square inch". Either of those measurements can be performed directly by a mechanical apparatus. The water column method is very similar to the, "millimeters of mercury" method, but more convenient for rather small amounts of pressure difference. An apparatus for measuring directly in pounds per square inch is a lot more inconvenient, but it can be done.

So, of all these direct measurement techniques, why do you pick the mercury method as less valid?
Do the millimeters of mercury not convert to inches of water column or pounds per square inch?
I never said that I was picking a mercury manometer as less valid than those others. The claim that I am opposing is that the official standard for the definition of pressure is based on a mercury manometer. It isn't. The use of any manometer for the purpose of defining pressure would be unacceptable for two very simple reasons -- a weak reason would be its dependence on temperature. A much stronger reason is its dependence on the local gravitational constant. You cannot convert millimeters of mercury to pounds per square inch without taking both of those into account, even if only implicitly.
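
To make that implicit dependence concrete, here is a minimal sketch in Python (the density, expansion coefficient, and gravity figures below are assumed reference values for illustration, not official ones): you cannot turn a column height into a pressure without picking a mercury density (which depends on temperature) and a value for local g.

Code:
# Hedged sketch: converting a mercury column height to a pressure.
# The reference values below are assumed for illustration only --
# the point is that both density and gravity must appear somewhere.

RHO_HG_0C = 13595.1      # kg/m^3, mercury density near 0 degC (assumed)
BETA_HG = 182e-6         # 1/K, volumetric thermal expansion of mercury (approx.)
G_STANDARD = 9.80665     # m/s^2, conventional standard gravity

def mmHg_to_pascal(height_mm, temp_c=0.0, g_local=G_STANDARD):
    """Pressure exerted by a mercury column: P = rho(T) * g * h."""
    rho = RHO_HG_0C / (1.0 + BETA_HG * temp_c)   # density falls as mercury warms
    return rho * g_local * (height_mm / 1000.0)

def pascal_to_psi(p_pa):
    return p_pa / 6894.757   # 1 psi is about 6894.757 Pa

# 760 mmHg at 0 degC and standard gravity comes out near 14.7 psi, but
# change the temperature or the local g and the "same" column height
# corresponds to a different pressure -- which is the whole point.
print(pascal_to_psi(mmHg_to_pascal(760)))            # ~14.70
print(pascal_to_psi(mmHg_to_pascal(760, 30, 9.78)))  # noticeably lower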

The international standard for pressure is the pascal, which is a pressure of exactly one newton per square meter. The international standard for a newton is a (net) force that will accelerate a one kilogram object at a rate of one meter per second per second (in an inertial reference frame). The international standard for the kilogram is a mass equal to the prototype kilogram, a platinum-iridium cylinder maintained at the International Bureau of Weights and Measures (though efforts are ongoing to redefine mass in terms of physical constants instead of a physical artifact). The international standard for the meter is the distance traveled by light in a vacuum in 1/299,792,458th of a second. The international standard for the second is the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom.

Notice that none of these standards, which are legally binding by international treaty among all signatory nations, say anything about mercury or manometers.
 

cmartinez

Joined Jan 17, 2007
8,253
The claim that I am opposing is that the official standard for the definition of pressure is based on a mercury manometer.
Could it be that the problem lies in that pressure is rather what I'd call a "composite unit" instead of a fundamental one? And as such, there is no real fundamental way to define it?

Still, there are instruments out there that are certified as pressure references, in order to calibrate other instruments. I'm not sure how this certification takes place, though.
 

Thread Starter

#12

Joined Nov 30, 2010
18,224
You cannot convert millimeters of mercury to pounds per square inch without taking both of those into account, even if only implicitly.
So who said you are not allowed to take them into account in order to use a known and respected method to arrive at an accurate measurement?

Here is a statement of a conversion factor:

One atmosphere (1 atm) is equivalent to 29.92 inches of mercury.

https://en.wikipedia.org/wiki/Barometer
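
For what it's worth, that 29.92 figure falls straight out of the arithmetic if you plug in conventional reference values for mercury density and standard gravity (values I'm assuming here for illustration, not quoting from any definition):

Code:
# Rough check of the 1 atm = 29.92 inHg conversion factor.
ATM_PA = 101325.0        # Pa, standard atmosphere
RHO_HG = 13595.1         # kg/m^3, mercury near 0 degC (assumed reference value)
G_STD = 9.80665          # m/s^2, conventional standard gravity

height_m = ATM_PA / (RHO_HG * G_STD)     # h = P / (rho * g)
height_in = height_m / 0.0254            # meters -> inches
print(round(height_in, 2))               # ~29.92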

I believe your question is, "Who said so and is he enough of an Authority to make it Official enough for WBahn?"

I might eventually find out who decided that this is the proper conversion factor. Then you can argue that he is not Official enough to suit you. In fact, I think you already proved that nobody is, except possibly a unanimous decision by every country on the planet and a Law passed to enforce it.

This is a published and accepted standard which is not legally binding on WBahn.
I am going to use the published standard until somebody points me to a better one.
 
Last edited:

GopherT

Joined Nov 23, 2012
8,009
I never said that I was picking a mercury manometer as less valid than those others. The claim that I am opposing is that the official standard for the definition of pressure is based on a mercury manometer. It isn't. The use of any manometer for the purpose of defining pressure would be unacceptable for two very simple reasons -- a weak reason would be its dependence on temperature. A much stronger reason is its dependence on the local gravitational constant. You cannot convert millimeters of mercury to pounds per square inch without taking both of those into account, even if only implicitly.

The international standard for pressure is the pascal, which is a pressure of exactly one newton per square meter. The international standard for a newton is a (net) force that will accelerate a one kilogram object at a rate of one meter per second per second (in an inertial reference frame). The international standard for the kilogram is a mass equal to the prototype kilogram, a platinum-iridium cylinder maintained at the International Bureau of Weights and Measures (though efforts are ongoing to redefine mass in terms of physical constants instead of a physical artifact). The international standard for the meter is the distance traveled by light in a vacuum in 1/299,792,458th of a second. The international standard for the second is the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom.

Notice that none of these standards, which are legally binding by international treaty among all signatory nations, say anything about mercury or manometers.
@WBahn
There are plenty of legally binding ISO standard procedures that validate the quality or quantity of a commercial transaction where the use of a mercury manometer is either required or listed as one measuring option. I don't know of any that allow the use of a Cesium 133 time source. They may exist but I have never heard of one used in a quality test method or a quantity test method.

I understand that SI quantities (primary or derived) are important standards, but those are way different than practical implementation in the field. Any business you plan to start would go broke immediately if you would insist on those types of measurement in your QA/QC labs.

And, anyone who does not use the temp correction info stamped on the side of the manometer is an idiot. It is part of the measurement.
 
Last edited:

Thread Starter

#12

Joined Nov 30, 2010
18,224
legally binding ISO standard procedures that validate the quality or quantity of a commercial transaction where the use of a mercury manometer is either required or listed
That's a source I didn't think of!
Still, is it signed by every country on the planet?
Is it legally binding in every country on the planet?
Does it rise to the level of WBahnally Official?

Is there any standard that is signed into Law by every country on the planet?

Edit: Coincidentally, those three requirements are in the correct order of stringency.;)
 
Last edited:

WBahn

Joined Mar 31, 2012
30,058
Could it be that the problem lies in that pressure is rather what I'd call a "composite unit" instead of a fundamental one? And as such, there is no real fundamental way to define it?

Still, there are instruments out there that are certified as pressure references, in order to calibrate other instruments. I'm not sure how this certification takes place, though.
Pressure is a "derived unit", meaning that its definition (and it appears that quite a few people are unclear on what a "definition" is) is in terms of the fundamental units. The fundamental units, in turn, are defined (and redefined as the technology evolves). I've given the definition of pressure -- it is the force exerted per unit area, and the unit of measurement is the pascal, which is defined as exactly one newton per square meter. That is the definition of the unit of measure for pressure and what it means.
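
Put another way, the pascal reduces entirely to the base units: 1 Pa = 1 N/m^2 = 1 kg/(m*s^2). A trivial illustration, with standard gravity as an assumed conventional value rather than part of any definition:

Code:
# Pressure from base quantities: a 1 kg mass resting on 1 cm^2.
mass_kg = 1.0
g = 9.80665               # m/s^2, conventional standard gravity (assumed)
area_m2 = 1e-4            # 1 cm^2

force_n = mass_kg * g             # newtons: kg * m / s^2
pressure_pa = force_n / area_m2   # pascals: N / m^2
print(pressure_pa)                # ~98066.5 Pa, a bit under one atmosphere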

Have you heard of the term "NIST Traceable"? That is integral to how high quality certifications occur.

In the context of measurements, a "standard" is THE one, single, defining measure for some quantity. A look at how the standard for length has been defined provides a good example.

In the latter part of the 1600s (for historical context, a couple of decades before the infamous Salem Witch Trials) it had become evident that a standard for length measurement was needed. Two proposals were drawn up. One was based on defining the unit of length to be the length of a pendulum that had a half period of one second. The other was to define it in terms of the circumference of the Earth. The first was rejected because, even then, it was known that the gravitational constant varies from place to place on the Earth's surface enough to render such a standard useless (and this exact same fact would render a mercury manometer equally useless as a standard for anything). So they chose to define it in terms of the circumference of the Earth. But, knowing even at that time that the Earth was not a perfect sphere, they had to specify a particular path along which to measure that circumference. The obvious choice was the equator, but they quickly determined that actually making that measurement would be impossible (at the time) due to the expanses of water involved. So instead they chose to define the distance from the equator to the north pole, along the meridian that passed through a particular point in Paris, France, as being exactly ten million of the new unit, the meter. The process of getting to this point took over a century, and the meter (or metre) was adopted as the official unit of measure, with that definition, by France in 1793.

As this was shaking out, it was understood that this definition required a much better measurement of the meridian, so a nearly decade-long expedition set out to measure a portion of this meridian very accurately. Unlike the equator, this was feasible because latitude could be determined to a high degree of accuracy. They chose as their segment the distance between two belfries, one in Barcelona and one in Dunkirk.

As it turned out, the irregularities of the Earth's surface proved that this definition for the length standard was unworkable. In about 1875, after a series of international conferences, the International Bureau of Weights and Measures (BIPM, from its French name) was established in Sèvres, France and charged with creating a "standard prototype" metre to serve as the standard for the meter. So they used these measurements to estimate the length of the meridian, took a platinum-iridium bar and, with it stabilized at the melting point of ice, inscribed two thin marks on it, and defined the distance between those two marks at that temperature as exactly one meter. The distance from the equator to the north pole was now irrelevant because it was no longer part of the definition. It turns out that, due to miscalculations in accounting for the flattening of the Earth, this standard is actually about 200 um (0.02%) shorter than was intended. Nonetheless, it WAS the definition of the length of one meter.

At that point, there was exactly ONE object that was THE standard for the length of the meter. So how does a machinist or a scientist make a length measurement using this standard? They use a traceable reference. The BIPM was also charged with creating "reference prototypes" (sometimes called "secondary standards") that were compared to the prototype standard and documented. These were then distributed to various places that usually made subsidiary references, which were distributed to lower levels until you got to instrument makers that used their references to make the measuring devices used by those machinists and scientists and whoever else. At each stage, the reference against which the next generation of instrument is calibrated is documented, and part of that documentation is whether the reference's calibration can be traced all the way back to the prototype standard. In countries that have an appropriate bureau of weights and measures it is sufficient to trace it back to that bureau (NIST, in our case), since that bureau is responsible for tracing all of its reference standards back to the actual standards.

But using a physical standard prototype has issues, not the least of which is the risk of the prototype being damaged or destroyed. It also has logistical issues of needing to physically compare things to an object that is located in one place. So even before 1900 efforts were underway to redefine the meter in terms of the wavelength of particular emission lines of light. This succeeded in 1960 when the meter was redefined as being equal to 1,650,763.73 wavelengths of the orange-red line of krypton-86 in a vacuum. But this built in uncertainties due to measurement methods, which were eliminated later (in 1983) by redefining the meter directly in terms of the speed of light in a vacuum, namely as the distance traveled by light in a vacuum in 1/299,792,458th of a second.

With this definition of the meter, standards laboratories, such as NIST, can create reference standards directly from the definition instead of having to compare them to some physical prototype. They can also generally do so much more accurately because the major limiting factors, such as the temperature of the standard and reference bars, are no longer an issue.

Which is not to say that there aren't still limiting factors, they are just smaller. For instance, today measurement labs usually "delineate" (they do NOT use the word "define", because it is NOT the definition) the length of a meter as 1,579,800.762042 wavelengths of a HeNe laser in a vacuum. In realizing a meter reference prototype, the accuracy is limited by the ability to determine the frequency of the laser, which is several orders of magnitude worse than our current ability to realize a period of one second based on its current definition.
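
As a quick sanity check on that figure, inverting the count gives the implied wavelength, which lands right at the familiar red HeNe line near 633 nm:

Code:
# Invert the wavelength count quoted above to see what wavelength it implies.
count = 1579800.762042
wavelength_m = 1.0 / count
print(wavelength_m * 1e9)   # ~632.99 nm -- the red HeNe line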

But once a reference prototype is made based on the standard definition, the process is still the same as before with references being made available, for sale or rent, by the standards bureaus that are then used to make instruments all while being traceable back to the standard.

So anytime someone goes on about how most people don't measure the length of an object using wavelengths of light, they are really just showing that they have no idea what a measurement standard is and how standards are used.
 

WBahn

Joined Mar 31, 2012
30,058
@WBahn
There are plenty of legally binding ISO standard procedures that validate the quality or quantity of a commercial transaction where the use of a mercury manometer is either required or listed as one measuring option. I don't know of any that allow the use of a Cesium 133 time source. They may exist but I have never heard of one used in a quality test method or a quantity test method.

I understand that SI quantities (primary or derived) are important standards, but those are way different than practical implementation in the field. Any business you plan to start would go broke immediately if you would insist on those types of measurement in your QA/QC labs.

And, anyone who does not use the temp correction info stamped on the side of the manometer is an idiot. It is part of the measurement.
Show me where I ever said that it was illegal to use a mercury manometer for measuring pressure. You can't, because I never made any such claim.

It is clear that you don't understand how standards work. Have you ever heard of the term "NIST Traceable"? If you have ANY instrument that involves time measurement in any way whose calibration is NIST Traceable, then it is being referred back to the definition of a second based on the NIST ensemble cesium fountain clock.

Whether or not a particular measurement needs to be NIST Traceable (i.e., traceable back to the defining standard) depends on the use of the measurement. For most measurements this is not necessary and is therefore not mandated. For many pressure measurements a mercury manometer is perfectly acceptable. In many instances there is no need to correct for anything at all, not even temperature. In other cases temperature imposes enough of an error source that it must be compensated for. In yet other cases the variations in local effective gravitational constant are sufficient that they must be compensated for. It all depends on what the requirements of the measurement are.

But ALL of this is irrelevant to the discussion at hand, which is what is the definition for the measurement of pressure. There can be only ONE.
 

WBahn

Joined Mar 31, 2012
30,058
That's a source I didn't think of!
Still, is it signed by every country on the planet?
Is it legally binding in every country on the planet?
Does it rise to the level of WBahnally Official?

Is there any standard that is signed into Law by every country on the planet?

Edit: Coincidentally, those three requirements are in the correct order of stringency.;)
Where did I ever say anything about anything being legally binding on every country on the planet? I specifically talked about international treaties that were legally binding on signatory nations.
 

WBahn

Joined Mar 31, 2012
30,058
Here's the N.O.A.A. calculator for atmospheric pressure, but that's only one country.:(

http://www.srh.noaa.gov/epz/?n=wxcalc_pressureconvert

Hmmm..."WBahnally Official". I think I just coined a new phrase.:p
So... what's your point?

Not only is it irrelevant to the discussion, since it has absolutely NOTHING to do with the DEFINITION of pressure or pressure measurement (a concept that seems almost impossible for some to grasp), but not one of those conversions even takes temperature into account.

Furthermore, the density of mercury itself varies according to where the mercury was obtained -- there are a half dozen stable isotopes ranging from an atomic mass of 196 to 204. The two most common are 200 and 202, but both are less than 30%. So the mercury used in a given barometer may well start out life deviating from the global average by more than 1%. Presumably manometer manufacturers take this into account when preparing their mercury for this use.

Then there's the thermal expansion, which is about 182 ppm/K at 20°C. So a measurement taken at -20°C and one taken at +40°C (assuming the thermal expansion coefficient is constant) could vary by more than 1%.

Then there's the variation in the local gravitational constant due both to the shape and composition of the Earth and also due to the rotation of the Earth. These vary by up to 0.7% over the surface of the Earth, making them comparable to the thermal expansion effects of the mercury itself.

Yet the conversion functions you link to take none of them into account.
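
Rough magnitudes, just to put the two dominant error sources side by side (the numbers below are the illustrative figures from above, not precise values):

Code:
# Back-of-the-envelope sizes for the error sources discussed above.
beta_hg = 182e-6            # 1/K, mercury volumetric expansion near 20 degC
dT = 60.0                   # K, the -20 degC to +40 degC span mentioned above
thermal_err = beta_hg * dT  # fractional error from thermal expansion alone

gravity_err = 0.007         # up to ~0.7% variation in local g, per the figure above

print("thermal expansion: %.2f%%" % (100 * thermal_err))   # ~1.09%
print("local gravity:     %.2f%%" % (100 * gravity_err))   # 0.70%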
 

Thread Starter

#12

Joined Nov 30, 2010
18,224
I was in the middle of writing about how direct measurements of physical things formed the basis for all of our Constants and Standards when WBahn posted (several times). Tracing a barometer back to 1643 is irrelevant. We are not discussing the validity of Standards with respect to how they are measured or derived. We are discussing semantics.

I used the word, "Standard" in a way WBahn doesn't. For that transgression, thousands of words and elaborate arguments must be presented.
So, what's your bottom line? That I must not use the word, "Standard" unless I can prove it to the level of WBahnally Official?

Not going to happen.
If four different dictionaries call a mercury sphygmomanometer, "The Gold Standard", that's what I use until you point to a better standard (and hopefully, correct all the dictionaries).

It is clear that you don't understand how standards work.
I specifically talked about international treaties that were legally binding on signatory nations.
True. I don't know how Standards work at the nanometer level of precision or the number of international treaty signatories you require. That is why I use known, published, Standards and Constants. I will continue to do that until you can point to a better one in any particular instance (and hopefully, correct the reference material I used to find that Standard).

By the way, have you found a better Standard for blood pressure than the mercury sphygmomanometer?
(Please answer in less than 500 words.)
 

GopherT

Joined Nov 23, 2012
8,009
It is clear that you don't understand how standards work. Have you ever heard of the term "NIST Traceable"?
You just made coffee come out of my wife's nose.

If you have ANY instrument that involves time measurement in any way whose calibration is NIST Traceable, then it is being referred back to the definition of a second based on the NIST ensemble cesium fountain clock.
The initial arguments circled around the fact that a mercury barometer was NOT the gold standard in measuring pressure. My challenge was two-fold but apparently missed my audience. More clearly this time: what do you view as the gold standard in measuring pressure?

Instead of any answer I got an unnecessary history lesson and a description of primary standards (also unnecessary). A definition of a meter is in no way a description of a gold standard in pressure measuring techniques.

So, if someone wants to measure pressure, what is the Gold standard technique? As far as I know, NIST does not have an "official pressure gauge" traceable back to primary standards. All they do is generate a pressure that is traceable back to primary standards. They will happily calibrate nearly any pressure gauge you want to send them based on their master pressure source (I believe they will decline a calibration request if they believe your tool will not maintain calibration for any reasonable time.)
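
As I understand it (and this is my sketch of the general idea, not a claim about NIST's actual apparatus), that generated pressure typically comes from something like a piston gauge / deadweight tester, where the pressure reduces to a calibrated mass, a measured piston area, and local gravity -- all quantities that can be traced back to primary standards:

Code:
# Hedged sketch of a piston-gauge (deadweight tester) pressure.
# All numbers below are assumed for illustration.
mass_kg = 10.0            # calibrated masses loaded on the piston
g_local = 9.8011          # m/s^2, measured local gravity
piston_area_m2 = 4.90e-5  # effective piston area from dimensional calibration

pressure_pa = mass_kg * g_local / piston_area_m2   # P = F / A = m*g / A
print(pressure_pa)        # ~2.0e6 Pa for these illustrative numbers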

So, if a NIST-calibrated mercury manometer is not a gold standard of measuring pressure, what measurement technique do you suggest a tradesman or scientist use if they need a pressure measurement?

I must have come into the middle of an old argument. In any case, there is always either an answer or a misunderstanding in science - or an opinion on yet-to-be-answered scientific questions. I don't see a yet-to-be-discovered issue here, so it is a question of semantics or other misunderstanding. There is definitely a lack of clarity in the term "Gold Standard" and several others. I'm out.
 
Last edited:

WBahn

Joined Mar 31, 2012
30,058
As I've pointed out several times, the very phrase "gold standard" is a largely informal one. While there might be other fields where it is/was used in a quasi-official way, the one (to my understanding) in which it was used predominantly is the medical/clinical field and its use is currently deprecated even there.

Even in the clinical field "gold standard" does NOT necessarily mean the most accurate technique, merely the technique that is recommended as the reference for clinical purposes. There are MANY measurements for which the "gold standard" technique is NOT the best method from an accuracy standpoint, but from a practical standpoint those other methods are often not usable. For instance, definitively determining whether certain diseases exist in a person can only be done via autopsy, and the "gold standard" tests that can be conducted on a living person for many of those diseases have huge false positive/negative rates. As another example, the "gold standard" for body fat measurement is hydrostatic weighing, even though this is significantly less accurate than the method of stripping a cadaver apart and weighing the fat, which is the method used when possible.

I have no problem with the claim that a mercury manometer is the "gold standard" for making blood pressure measurements. Never did. Never claimed to. I had a problem, back in the original discussion, when this "gold standard" claim was used as the basis for claiming that the pressure reading of a mercury manometer was, by definition, correct and that such a measurement WAS the definition for the measurement of pressure, blood or otherwise, period. It was also claimed back then that the measurement produced by a mercury manometer was error free. Traces of this claim still exist in the blog post that started this thread.
 