How frequently should an instrument be calibrated?

Thread Starter

Jovex Corba

Joined May 25, 2015
1
From a scientific point of view, calibration is a comparison between measurements. Many kinds of instruments are involved in measurement, such as electrical instruments, mechanical instruments, and thermal instruments.

(1) Electro-technical :- Electro-technical calibration deals with electronic and electrical instruments such as temperature controllers and pressure transmitters. Here we basically have to simulate the electrical signal.

(2) Mechanical :- Mechanical calibration deals with instruments whose parts are purely mechanical or a mix of mechanical and electrical, for example pressure calibrators such as the BetaGauge 321A Advanced Pressure Calibrator, pressure gauges, and more.

(3) Thermal :- This category covers instruments that directly sense thermal parameters in a process line; the best example is a temperature calibrator.

Their performance depends on the standards used and the accuracy required, so every measurement instrument should be calibrated on a regular basis.
Please give your suggestion about calibration.
 

Alec_t

Joined Sep 17, 2013
14,263
The need for and frequency of calibration would clearly depend on the use to which the instruments are put. In life-critical situations, for example, instrument accuracy might be vital and so require frequent calibration (it could even be a legal or contractual requirement). For domestic use, however, calibration may be totally unnecessary.
 

crutschow

Joined Mar 14, 2008
34,201
For commercial applications, any instruments that are used to verify the operation of a process or device must be calibrated at a regular interval.

I worked in the aerospace industry and all instruments had to be calibrated on a regular basis by contract, the interval depending upon the type of instrument (for example, bench DMMs had a longer interval than signal generators).
There were calibration stickers on all instruments and any that were out of calibration were immediately removed from service until they were calibrated (since if a test was done with an instrument out of calibration, the test results could be invalidated).

For home use, calibration of typical lab instruments is obviously optional and likely not needed.
Modern solid-state instruments are so stable that it will likely be many years (if not decades) before they drift significantly out of their calibration limits.
 

nsaspook

Joined Aug 27, 2009
12,998
For home use, electrical calibration to a primary standard on a regular time schedule is usually not needed, but I usually include and log a secondary standard in most measurement systems that collect long-term data, to check for measurement-system drift.

The repeatability and stability of results is often more important than the absolute value of the measurement, because I can go back and correct the recorded values to the actual values later if everything is perfectly stable but inaccurate.

In a complex process with many factors there is usually a 'golden machine' that makes a 'golden wafer' (usually analyzed by an external lab) which becomes the standard for the other processes. Once that 'golden' process is standardized, other machines are adjusted, within limits, to produce products just like it. So the needed calibration period for each type of measurement device usually depends on the stability of its measurements, not on its absolute accuracy at the time of calibration. Measuring electrical properties with stable devices like a DVM is easy, so we usually like to create quality checks that look for those kinds of measurements by proxy: high-precision pressure, mechanical, or temperature measurements, since those are usually adjusted within limits to make the electrical properties correct and in control.
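As a rough illustration of the check-standard idea (the dates, readings, and drift limit below are invented, not from any real system), a drift log in Python might look like this:

```python
# Minimal sketch: log readings of a secondary (check) standard taken by the
# measurement system and flag drift. All values here are placeholders.

from statistics import mean

DRIFT_LIMIT = 0.05  # allowable shift in the check-standard reading (placeholder)

# each entry: (date, reading of the check standard by the system under test)
check_log = [
    ("2015-06-01", 10.002),
    ("2015-07-01", 10.004),
    ("2015-08-01", 10.031),
    ("2015-09-01", 10.074),
]

# the earliest readings define the baseline for this check standard
baseline = mean(reading for _, reading in check_log[:2])

for date, reading in check_log:
    drift = reading - baseline
    status = "OK" if abs(drift) <= DRIFT_LIMIT else "DRIFT -- investigate / recalibrate"
    print(f"{date}: reading={reading:.3f}  drift={drift:+.3f}  {status}")

# If the system turns out to be stable but offset, the same log lets you
# correct the recorded process values after the fact:
#     corrected_value = recorded_value - drift_at_that_time
```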

Example:
http://www.nordson.com/en-us/divisi...Accuracy-and-Repeatability-NordsonASYMTEK.pdf
 
Last edited:

crutschow

Joined Mar 14, 2008
34,201
The Repeatability and stability of results is often more important than the absolute value of the measurement because I can go back and correct the recorded values to the actual values later if everything is perfectly stable but inaccurate ...
That reminds me of the Hubble Telescope mirror error. Due to the incorrect assembly of a null corrector used to verify the shape of the mirror during grinding, the mirror was ground incorrectly, but very accurately, to a faulty shape. Because of this accuracy they were able to build and install precise corrective optics that compensated for the error, allowing the Hubble to meet its original resolution specs and take the fantastically detailed pictures of the cosmos we've all seen.
 

wayneh

Joined Sep 9, 2010
17,495
Please give your suggestion about calibration.
My suggestion is to follow the calibration recommendations of the instrument manufacturer. They'll have a good idea what the frequency should be in order to meet the desired level of accuracy.

Lacking that expert guidance, you can establish a calibration schedule by noting how frequently and by how much the instrument is found to be out of calibration. Then consult a statistical textbook to calculate the ideal sampling (calibration) frequency to meet your needs.
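As a very rough sketch of that idea, assuming a simple linear-drift model (all numbers below are invented for illustration):

```python
# Rough sketch of estimating a calibration interval from as-found history,
# assuming simple linear drift between calibrations. Numbers are placeholders.

# (days since the previous calibration, error found at that calibration)
history = [(365, 0.012), (365, 0.018), (365, 0.015)]

tolerance = 0.020      # largest error the application can accept (placeholder)
safety_factor = 0.7    # stay well inside the tolerance

# average drift rate in error-units per day
drift_rate = sum(error / days for days, error in history) / len(history)

recommended_interval = safety_factor * tolerance / drift_rate
print(f"Estimated drift rate: {drift_rate:.6f} per day")
print(f"Suggested recalibration interval: {recommended_interval:.0f} days")
```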

Don't forget the distinction between the instrument and its attached sensors or probes.
 

BillB3857

Joined Feb 28, 2009
2,570
I managed the maintenance calibration lab for an aerospace manufacturer. Some equipment was on a fixed timetable and the rest on a variable one. For the variable schedule, the condition at the time of calibration was noted: the unit was either "in calibration", "out of calibration", or "significantly out of calibration". If "in calibration", the interval was extended from the norm. If "out of calibration", the normal schedule was maintained. Units that were "significantly out of calibration" had their recalibration interval shortened.
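Roughly, the logic of that variable schedule could be sketched like this (the interval multipliers here are placeholders, not the actual figures used):

```python
# Rough sketch of the variable-interval policy described above.
# The condition names match the post; the multipliers are placeholders.

def next_interval(current_days: int, as_found: str) -> int:
    """Adjust the recalibration interval based on the as-found condition."""
    if as_found == "in calibration":
        return int(current_days * 1.5)   # extend beyond the norm
    if as_found == "out of calibration":
        return current_days              # keep the normal schedule
    if as_found == "significantly out of calibration":
        return int(current_days * 0.5)   # shorten the schedule
    raise ValueError(f"unknown as-found condition: {as_found}")

print(next_interval(365, "in calibration"))                    # 547
print(next_interval(365, "significantly out of calibration"))  # 182
```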
 