What is a multifunction process calibrator?

Thread Starter


Joined Aug 15, 2009

Could you kindly shed some light on what a multifunction process calibrator is, or point me to useful links? Surprisingly, Google turns up little information on the topic other than vendor material. There are quite a few devices sold on the market, both by well-known manufacturers like Fluke and by others such as https://www.brightwinelectronics.com/brt-lb01-lb02-process-calibrator-user-manual.html . As far as I can see, it is not any different from a combined voltage, current, and resistance standard, although the cheap ones do not seem to have metrology-grade precision. However, I wonder whether these are good enough to calibrate a multimeter or an oscilloscope?



Joined Aug 27, 2009

Most of these devices are designed for maintenance work (scheduled and unscheduled) on industrial control systems inside a manufacturing facility. I would say that most are good enough to check whether limited functions on a multimeter or oscilloscope are within specification, but the calibration of precision instruments requires specialized equipment for that purpose.



Joined Jan 15, 2015
However I wonder if these are good enough to calibrate a multimeter or an oscilloscope ?
With that in mind, I guess you need to look at the TUR (Test Uncertainty Ratio) you want, or possibly need, for whatever you are calibrating. If the unit under test has a 1% tolerance and you want a 4:1 TUR, your standard needs an uncertainty of 0.25% or better. So for a standard, how good is good enough?
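The arithmetic above is just a division; a minimal sketch (my own illustration, not from any standard document) might look like this:

```python
def required_standard_uncertainty(uut_tolerance_pct: float, tur: float) -> float:
    """Maximum allowable standard uncertainty (in %) for a given
    unit-under-test tolerance and desired Test Uncertainty Ratio."""
    if tur <= 0:
        raise ValueError("TUR must be positive")
    return uut_tolerance_pct / tur

# The example from the post: 1% UUT tolerance at a 4:1 TUR.
print(required_standard_uncertainty(1.0, 4.0))  # 0.25
```

The same function answers the question either way: pick the TUR you need, and it tells you how good your standard has to be.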



Joined Sep 24, 2015
In metrology, when calibrating any tool or piece of test equipment, the Unit Under Calibration (UUC) is calibrated against a known standard that is 10 times more accurate than the UUC. For instance, when calibrating a micrometer that measures to the nearest 0.001, the tool used to check the calibration should be accurate to the nearest 0.0001 (a factor of 10). I would assume that when calibrating a multimeter that is accurate to 0.01, the tool used to calibrate it should be accurate to 0.001 (volts, ohms, amps, whatever).

That's the industry standard and in accordance with ISO standards. You can calibrate a rope against a piece of string if you so desire; the level of accuracy will depend on what you are calibrating it to.


Joined Jan 15, 2015
Tony, while I agree a 10:1 TUR is nice to have, my reasoning for mentioning a 4:1 TUR was based on the following:

"Some quality standards attempt to define what this ratio should be. ANSI/NCSL Z540-1-1994 states “The laboratory shall ensure that calibration uncertainties are sufficiently small so that the adequacy of the measurement is not affected” It also states “Collective uncertainty of the measurement standards shall not exceed 25% of the acceptable tolerance (e.g. Manufacturer specifications)”. This 25% equates to a TUR of 4:1. Other quality standards have recommended TUR's as high as 10:1. For some, a TUR of 3:1, 2:1 or even 1:1 is acceptable. Any of these may be acceptable to a specific user who understands the risks that are involved with lower TUR's or builds these into his/her measurement process. When accepting a TUR less than 4:1, it is important to consider the UUT's tolerance band where its “As Found” reading is determined to lie and more important, where the UUT is left during the calibration process".

In some cases the TUR is also determined by the state of the art, and that is where the 1:1 ratio is derived from. Additionally, I am aware that ANSI/NCSL Z540-1-1994 has been superseded, but I haven't been involved directly with metrology since the early 90s, and I doubt how they define an acceptable TUR has changed much. Lest I forget, the above quote came from here. During my younger days I was afforded a nice visit to the then NBS (National Bureau of Standards), now NIST, and was active with the NCSL, who helped write much of what is found in the old MIL-STD-45662.
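The 25% rule quoted from ANSI/NCSL Z540-1-1994 can be sketched as a couple of lines of code (a hedged illustration of the arithmetic only, not an excerpt from the standard):

```python
def tur(uut_tolerance: float, standard_uncertainty: float) -> float:
    """Test Uncertainty Ratio: UUT tolerance over standard uncertainty."""
    return uut_tolerance / standard_uncertainty

def meets_z540_25_percent_rule(uut_tolerance: float,
                               standard_uncertainty: float) -> bool:
    """True if the standard's collective uncertainty does not exceed
    25% of the acceptable tolerance, i.e. the TUR is at least 4:1."""
    return standard_uncertainty <= 0.25 * uut_tolerance

print(tur(1.0, 0.25))                          # 4.0 -> exactly 4:1
print(meets_z540_25_percent_rule(1.0, 0.25))   # True
print(meets_z540_25_percent_rule(1.0, 0.5))    # False (only 2:1)
```

As the quote notes, lower ratios (3:1, 2:1, even 1:1) can still be acceptable if you understand and account for the added risk.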

All that aside, unless the original poster wants to meet compliance with some standard(s), my best guess is that an acceptable TUR is anything that works. :)