Calibration
In measurement technology and metrology, calibration is the comparison of measurement
values delivered by a device under test with those of a calibration standard of known
accuracy. Such a standard could be another measurement device of known accuracy, a device generating the quantity to be measured, such as a voltage or a sound tone, or a physical artifact such as a meter ruler.
The outcome of the comparison can result in one of the following:
no significant error being noted on the device under test
a significant error being noted but no adjustment made
an adjustment made to correct the error to an acceptable level
Strictly speaking, the term "calibration" means just the act of comparison and does not
include any subsequent adjustment.
The calibration standard is normally traceable to a national or international standard held
by a metrology body.
BIPM Definition
The formal definition of calibration by the International Bureau of Weights and Measures
(BIPM) is the following: "Operation that, under specified conditions, in a first step,
establishes a relation between the quantity values with measurement uncertainties
provided by measurement standards and corresponding indications with associated
measurement uncertainties (of the calibrated instrument or secondary standard) and, in a
second step, uses this information to establish a relation for obtaining a measurement
result from an indication."[1]
This definition states that the calibration process is purely a comparison, but introduces the
concept of measurement uncertainty in relating the accuracies of the device under test and
the standard.
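The two steps of this definition can be illustrated with a short sketch. This is only a hedged illustration, not the BIPM's procedure: the instrument readings and standard values below are invented example data, a simple linear relation is assumed, and the associated measurement uncertainties are omitted for brevity.

```python
# Illustrative sketch of the BIPM two-step view of calibration (example data only).
import numpy as np

# Step 1: establish a relation between the values provided by the measurement
# standards and the corresponding indications of the instrument under calibration.
standard_values = np.array([0.0, 25.0, 50.0, 75.0, 100.0])   # known reference values
indications     = np.array([0.3, 25.1, 50.4, 75.2, 100.6])   # what the instrument displayed

# Fit a simple linear relation: indication = a * value + b
a, b = np.polyfit(standard_values, indications, deg=1)

# Step 2: use that relation to obtain a measurement result from a later indication.
def measurement_result(indication):
    """Invert the fitted relation to convert an indication into a measurement result."""
    return (indication - b) / a

print(measurement_result(60.2))  # corrected estimate of the measured quantity
```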
Modern calibration processes
The increasing need for known accuracy and uncertainty and the need to have consistent
and comparable standards internationally has led to the establishment of national
laboratories. In many countries a National Metrology Institute (NMI) will exist which will
maintain primary standards of measurement (the main SI units plus a number of derived
units) which will be used to provide traceability to customer's instruments by calibration.
The NMI supports the metrological infrastructure in that country (and often others) by
establishing an unbroken chain, from the top level of standards to an instrument used for
measurement. Examples of National Metrology Institutes are NPL in the UK, NIST in the
United States, PTB in Germany and many others. Since the Mutual Recognition Agreement was signed, it is now straightforward to take traceability from any participating NMI, and it is no longer necessary for a company to obtain traceability for measurements from the NMI of the country in which it is situated (such as the National Physical Laboratory in the UK).
Quality of calibration
To improve the quality of the calibration and have the results accepted by outside
organizations it is desirable for the calibration and subsequent measurements to be
"traceable" to the internationally defined measurement units. Establishing traceability is
accomplished by a formal comparison to a standard which is directly or indirectly related
to national standards (such as NIST in the USA), international standards, or certified
reference materials. This may be done by national standards laboratories operated by the
government or by private firms offering metrology services.
Quality management systems call for an effective metrology system which includes formal,
periodic, and documented calibration of all measuring instruments. The ISO 9000[2] and ISO 17025[3] standards require that these traceable actions be performed to a high level and set out how they can be quantified.
To communicate the quality of a calibration the calibration value is often accompanied by a
traceable uncertainty statement to a stated confidence level. This is evaluated through
careful uncertainty analysis. Sometimes a DFS (Departure From Spec) is required to
operate machinery in a degraded state. Whenever this does happen, it must be in writing
and authorized by a manager with the technical assistance of a calibration technician.
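The uncertainty statement mentioned above is commonly evaluated by combining individual uncertainty contributions. The following is a minimal GUM-style sketch, assuming independent contributions and a coverage factor of k = 2 for an approximately 95% confidence level; the component names and values are invented examples, not any particular laboratory's uncertainty budget.

```python
import math

# Hypothetical standard-uncertainty contributions (same units as the measurand),
# e.g. from the reference standard, the readout resolution, and repeatability.
components = {
    "reference_standard": 0.010,
    "resolution":         0.003,
    "repeatability":      0.007,
}

# Combined standard uncertainty: root sum of squares of independent components.
u_c = math.sqrt(sum(u**2 for u in components.values()))

# Expanded uncertainty with coverage factor k = 2 (approximately 95% confidence).
k = 2
U = k * u_c

print(f"combined standard uncertainty u_c = {u_c:.4f}")
print(f"expanded uncertainty U (k={k})    = {U:.4f}")
```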
Measuring devices and instruments are categorized according to the physical quantities
they are designed to measure. These vary internationally, e.g., NIST Handbook 150-2G in the U.S.[4] and NABL-141 in India.[5] Together, these standards cover instruments that measure various
physical quantities such as electromagnetic radiation (RF probes), sound (sound level meter
or noise dosimeter), time and frequency (intervalometer), ionizing radiation (Geiger
counter), light (light meter), mechanical quantities (limit switch, pressure gauge, pressure
switch), and thermodynamic or thermal properties (thermometer, temperature controller).
The standard instrument for each test device varies accordingly, e.g., a dead weight tester
for pressure gauge calibration and a dry block temperature tester for temperature gauge
calibration.
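A dead weight tester works as a pressure standard because the pressure it generates follows directly from known masses acting on a piston of known area. The sketch below is a simplified illustration only: the masses and piston area are invented example values, and real testers also correct for local gravity, buoyancy, and piston temperature.

```python
# Simplified dead weight tester: pressure generated by known masses on a piston.
G_LOCAL = 9.80665        # m/s^2, standard gravity (real testers use the local value)
PISTON_AREA = 4.0e-5     # m^2, effective piston-cylinder area (example value)

def generated_pressure(masses_kg):
    """Pressure (Pa) produced by the listed masses resting on the piston."""
    force = sum(masses_kg) * G_LOCAL          # N
    return force / PISTON_AREA                # Pa = N / m^2

# Example: piston carrier plus two weights.
pressure_pa = generated_pressure([0.5, 1.0, 2.0])
print(f"{pressure_pa:.0f} Pa  ({pressure_pa / 1000:.1f} kPa)")
```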
Instrument calibration prompts
Calibration may be required for the following reasons:
a new instrument
after an instrument has been repaired or modified
moving from one location to another location
when a specified time period has elapsed
when a specified usage (operating hours) has elapsed
before and/or after a critical measurement
after an event, for example
after an instrument has been exposed to a shock, vibration, or physical damage, which
might potentially have compromised the integrity of its calibration
sudden changes in weather
whenever observations appear questionable or instrument indications do not match the output
of surrogate instruments
as specified by a requirement, e.g., customer specification, instrument manufacturer
recommendation.
In general use, calibration is often regarded as including the process of adjusting the
output or indication on a measurement instrument to agree with the value of the applied
standard, within a specified accuracy. For example, a thermometer could be calibrated so
the error of indication or the correction is determined, and adjusted (e.g. via calibration
constants) so that it shows the true temperature in Celsius at specific points on the scale.
This is the perception of the instrument's end-user. However, very few instruments can be
adjusted to exactly match the standards they are compared to. For the vast majority of
calibrations, the calibration process is actually the comparison of an unknown to a known
and recording the results.
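As a hedged sketch of the thermometer example above, determined errors of indication can be stored as calibration constants at specific scale points and applied, with interpolation between points, to later readings. The calibration points and correction values here are invented for illustration only.

```python
# Hypothetical corrections (true value minus indication) determined at calibration points.
cal_points  = [0.0, 25.0, 50.0, 75.0, 100.0]    # indicated temperature, degrees C
corrections = [+0.2, +0.1, 0.0, -0.1, -0.3]     # correction to add at each point

def corrected_reading(indicated):
    """Apply the stored corrections, interpolating linearly between calibration points."""
    if indicated <= cal_points[0]:
        return indicated + corrections[0]
    if indicated >= cal_points[-1]:
        return indicated + corrections[-1]
    for (x0, c0), (x1, c1) in zip(zip(cal_points, corrections),
                                  zip(cal_points[1:], corrections[1:])):
        if x0 <= indicated <= x1:
            c = c0 + (c1 - c0) * (indicated - x0) / (x1 - x0)
            return indicated + c

print(corrected_reading(37.5))   # indication of 37.5 C corrected to about 37.55 C
```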
Basic calibration process
Purpose and scope
The calibration process begins with the design of the measuring instrument that needs to
be calibrated. The design has to be able to "hold a calibration" through its calibration
interval. In other words, the design has to be capable of measurements that are "within
engineering tolerance" when used within the stated environmental conditions over some
reasonable period of time.[6] Having a design with these characteristics increases the
likelihood of the actual measuring instruments performing as expected. Basically, the purpose of calibration is to maintain the quality of measurement as well as to ensure the proper working of a particular instrument.
Frequency
The exact mechanism for assigning tolerance values varies by country and industry type. The measuring equipment manufacturer generally assigns the measurement tolerance, suggests a calibration interval (CI) and specifies the environmental range of use and storage. The using organization generally assigns the actual calibration interval, which is dependent on this specific measuring equipment's likely usage level. The assignment of calibration intervals can be a formal process based on the results of previous calibrations (see the sketch after the quotations below). The standards themselves are not clear on recommended CI values:[7]
ISO 17025[3]
"A calibration certificate (or calibration label) shall not contain any recommendation on
the calibration interval except where this has been agreed with the customer. This
requirement may be superseded by legal regulations."
ANSI/NCSL Z540[8]
"...shall be calibrated or verified at periodic intervals established and maintained to
assure acceptable reliability..."
ISO-9001[2]
"Where necessary to ensure valid results, measuring equipment shall...be calibrated or
verified at specified intervals, or prior to use..."
MIL-STD-45662A[9]
"... shall be calibrated at periodic intervals established and maintained to assure
acceptable accuracy and reliability...Intervals shall be shortened or may be lengthened,
by the contractor, when the results of previous calibrations indicate that such action is
appropriate to maintain acceptable reliability."
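The sketch below illustrates one simple reactive interval-adjustment rule of the kind the MIL-STD wording above permits: shorten the interval after an out-of-tolerance as-found result, lengthen it after repeated in-tolerance results. The percentages and limits are invented example policy values, not taken from any of the standards quoted.

```python
def next_interval(current_days, recent_results,
                  shorten=0.7, lengthen=1.25,
                  min_days=30, max_days=730):
    """Suggest the next calibration interval from recent as-found results.

    recent_results: list of booleans, True = found in tolerance, newest last.
    Shorten the interval if the latest result was out of tolerance; lengthen
    it only after several consecutive in-tolerance results.
    """
    if not recent_results:
        return current_days
    if not recent_results[-1]:                       # latest as-found out of tolerance
        proposed = current_days * shorten
    elif len(recent_results) >= 3 and all(recent_results[-3:]):
        proposed = current_days * lengthen           # three in-tolerance results in a row
    else:
        proposed = current_days                      # keep the interval unchanged
    return int(min(max(proposed, min_days), max_days))

print(next_interval(365, [True, True, True]))   # -> 456 (lengthened)
print(next_interval(365, [True, False]))        # -> 255 (shortened)
```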
Standards required and accuracy
The next step is defining the calibration process. The selection of a standard or standards is
the most visible part of the calibration process. Ideally, the standard has less than 1/4 of the
measurement uncertainty of the device being calibrated. When this goal is met, the
accumulated measurement uncertainty of all of the standards involved is considered to be
insignificant when the final measurement is also made with the 4:1 ratio.[10] This ratio was
probably first formalized in Handbook 52 that accompanied MIL-STD-45662A, an early US
Department of Defense metrology program specification. It was 10:1 from its inception in
the 1950s until the 1970s, when advancing technology made 10:1 impossible for most
electronic measurements.[11]
Maintaining a 4:1 accuracy ratio with modern equipment is difficult. The test equipment
being calibrated can be just as accurate as the working standard.[10] If the accuracy ratio is
less than 4:1, then the calibration tolerance can be reduced to compensate. When 1:1 is
reached, only an exact match between the standard and the device being calibrated is a
completely correct calibration. Another common method for dealing with this capability
mismatch is to reduce the accuracy of the device being calibrated.
For example, a gauge with 3% manufacturer-stated accuracy can be changed to 4% so that a
1% accuracy standard can be used at 4:1. If the gauge is used in an application requiring
16% accuracy, having the gauge accuracy reduced to 4% will not affect the accuracy of the
final measurements. This is called a limited calibration. But if the final measurement
requires 10% accuracy, then the 3% gauge never can be better than 3.3:1. Then perhaps
adjusting the calibration tolerance for the gauge would be a better solution. If the
calibration is performed at 100 units, the 1% standard would actually be anywhere
between 99 and 101 units. The acceptable values of calibrations where the test equipment is
at the 4:1 ratio would be 96 to 104 units, inclusive. Changing the acceptable range to 97 to
103 units would remove the potential contribution of all of the standards and preserve a
3.3:1 ratio. Continuing, a further change to the acceptable range to 98 to 102 restores more
than a 4:1 final ratio.
This is a simplified example. The mathematics of the example can be challenged. It is
important that whatever thinking guided this process in an actual calibration be recorded
and accessible. Informality contributes to tolerance stacks and other difficult-to-diagnose post-calibration problems.
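The arithmetic of the example above can be reproduced in a short sketch. This is a simplified illustration of the accuracy-ratio and guard-banding reasoning only, not a complete uncertainty analysis, and it inherits the same simplifications as the prose example.

```python
def accuracy_ratio(process_tolerance_pct, gauge_tolerance_pct):
    """Ratio of the tolerance of the final measurement to the gauge's tolerance."""
    return process_tolerance_pct / gauge_tolerance_pct

def acceptance_limits(nominal, gauge_tolerance_pct, standard_tolerance_pct=0.0):
    """Calibration acceptance range, optionally guard-banded by the standard's
    own tolerance so the standard's potential contribution is removed."""
    half_width = nominal * (gauge_tolerance_pct - standard_tolerance_pct) / 100.0
    return nominal - half_width, nominal + half_width

print(accuracy_ratio(16, 4))    # 4.0   -> the limited calibration at 4% still meets 4:1
print(accuracy_ratio(10, 3))    # 3.33  -> the 3% gauge can never do better than 3.3:1

# Calibration at 100 units with a 1% standard:
print(acceptance_limits(100, 4))        # (96.0, 104.0)  plain 4% limits
print(acceptance_limits(100, 4, 1))     # (97.0, 103.0)  guard-banded by the 1% standard
print(acceptance_limits(100, 3, 1))     # (98.0, 102.0)  tighter limits, better than 4:1 overall
```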
Also in the example above, ideally the calibration value of 100 units would be the best point
in the gauge's range to perform a single-point calibration. It may be the manufacturer's
recommendation or it may be the way similar devices are already being calibrated.
Multiple point calibrations are also used. Depending on the device, a zero unit state, the
absence of the phenomenon being measured, may also be a calibration point. Or zero may
be resettable by the user; there are several variations possible. Again, the points to use
during calibration should be recorded.
There may be specific connection techniques between the standard and the device being
calibrated that may influence the calibration. For example, in electronic calibrations
involving analog phenomena, the impedance of the cable connections can directly
influence the result.
Manual and automatic calibrations
Calibration methods for modern devices can be manual or automatic.
As an example, a manual process may be used for
calibration of a pressure gauge. The procedure requires
multiple steps,[12] to connect the gauge under test to a
reference master gauge and an adjustable pressure source,
to apply fluid pressure to both reference and test gauges at
definite points over the span of the gauge, and to compare
the readings of the two. The gauge under test may be
adjusted to ensure its zero point and response to pressure
comply as closely as possible to the intended accuracy.
Each step of the process requires manual record keeping.
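A minimal sketch of the record-keeping and comparison step in such a procedure is shown below. The pressure points, readings, and tolerance are invented example data; an actual procedure such as USBR 1040 specifies its own test points and acceptance criteria.

```python
# Compare the gauge under test against the reference master gauge at points
# over the span, and record the error at each point (example data only).
TOLERANCE = 0.5   # allowed error, in the same units as the readings (e.g. psi)

test_points = [
    # (reference reading, gauge-under-test reading)
    (0.0,    0.1),
    (25.0,  25.3),
    (50.0,  50.6),
    (75.0,  74.8),
    (100.0, 99.7),
]

records = []
for reference, indicated in test_points:
    error = indicated - reference
    records.append({
        "reference": reference,
        "indicated": indicated,
        "error": error,
        "in_tolerance": abs(error) <= TOLERANCE,
    })

for r in records:
    status = "PASS" if r["in_tolerance"] else "FAIL"
    print(f"{r['reference']:6.1f}  {r['indicated']:6.1f}  {r['error']:+5.2f}  {status}")
```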
An automatic pressure calibrator [13] is a device that combines an electronic control unit, a pressure intensifier used to compress a gas such as nitrogen, a pressure transducer used to detect desired levels in a hydraulic accumulator, and accessories such as liquid traps and gauge fittings. An automatic system may also include data collection facilities to automate the gathering of data for record keeping.
Manual calibration - US serviceman calibrating a pressure gauge. The device under test is on his left and the test standard on his right.
Process description and documentation
All of the information above is collected in a calibration procedure, which is a specific test
method. These procedures capture all of the steps needed to perform a successful
calibration. The manufacturer may provide one or the organization may prepare one that
also captures all of the organization's other requirements. There are clearinghouses for
calibration procedures such as the Government-Industry Data Exchange Program (GIDEP)
in the United States.
This exact process is repeated for each of the standards
used until transfer standards, certified reference materials
and/or natural physical constants, the measurement
standards with the least uncertainty in the laboratory, are
reached. This establishes the traceability of the calibration.
See Metrology for other factors that are considered during
calibration process development.
Automatic calibration - A U.S. serviceman using a 3666C auto pressure calibrator
After all of this, individual instruments of the specific type discussed above can finally be calibrated. The process
generally begins with a basic damage check. Some
organizations such as nuclear power plants collect "as-found" calibration data before any
routine maintenance is performed. After routine maintenance and deficiencies detected
during calibration are addressed, an "as-left" calibration is performed.
More commonly, a calibration technician is entrusted with the entire process and signs the
calibration certificate, which documents the completion of a successful calibration. The
basic process outlined above is a difficult and expensive challenge. The cost for ordinary
equipment support is generally about 10% of the original purchase price on a yearly basis,
as a commonly accepted rule-of-thumb. Exotic devices such as scanning electron
microscopes, gas chromatograph systems and laser interferometer devices can be even
more costly to maintain.
The 'single measurement' device used in the basic calibration process description above
does exist. But, depending on the organization, the majority of the devices that need
calibration can have several ranges and many functionalities in a single instrument. A good
example is a common modern oscilloscope. There could easily be 200,000 combinations of settings to completely calibrate, and limitations on how much of an all-inclusive calibration can be automated.
To prevent unauthorized access to an instrument, tamper-proof seals are usually applied after calibration. The picture of the oscilloscope rack shows these seals; they prove that the instrument has not been removed since it was last calibrated, as they would reveal any unauthorized access to the adjusting elements of the instrument. There are also labels showing the date of the last calibration and when the calibration interval dictates the next one is needed. Some organizations also assign a unique identification to each instrument to standardize the record keeping and keep track of accessories that are integral to a specific calibration condition.
An instrument rack with tamper-indicating seals
When the instruments being calibrated are integrated with computers, the integrated
computer programs and any calibration corrections are also under control.
Historical development
Origins
The words "calibrate" and "calibration" entered the English language as recently as the
American Civil War,[14] in descriptions of artillery, thought to be derived from a
measurement of the calibre of a gun.
Some of the earliest known systems of measurement and calibration seem to have been
created between the ancient civilizations of Egypt, Mesopotamia and the Indus Valley, with
excavations revealing the use of angular gradations for construction.[15] The term
"calibration" was likely first associated with the precise division of linear distance and
angles using a dividing engine and the measurement of gravitational mass using a weighing
scale. These two forms of measurement alone and their direct derivatives supported nearly
all commerce and technology development from the earliest civilizations until about AD
1800.[16]
Calibration of weights and distances (c. 1100 CE)
Early measurement devices were direct, i.e. they had the same
units as the quantity being measured. Examples include length
using a yardstick and mass using a weighing scale. At the
beginning of the twelfth century, during the reign of Henry I
(1100-1135), it was decreed that a yard be "the distance from the
tip of the King's nose to the end of his outstretched thumb."[17]
However, it wasn't until the reign of Richard I (1197) that we find
documented evidence.[18]
Assize of Measures
"Throughout the realm there shall be the same yard of the same
size and it should be of iron."
Other standardization attempts followed, such as the Magna Carta (1225) for liquid measures, until the Mètre des Archives from France and the establishment of the Metric system.
An example of a weighing scale with a 1⁄2 ounce calibration error at zero. This is a "zeroing error" which is inherently indicated, and can normally be adjusted by the user, but may be due to the string and rubber band in this case.
The early calibration of pressure instruments
One of the earliest pressure measurement devices was the Mercury barometer, credited to Torricelli (1643),[19] which read atmospheric pressure using Mercury. Soon after, water-filled
manometers were designed. All these would have linear
calibrations using gravimetric principles, where the difference in
levels was proportional to pressure. The normal units of measure would be the convenient
inches of mercury or water.
In the direct reading hydrostatic manometer design on the right, applied
pressure Pa pushes the liquid down the right side of the manometer U-
tube, while a length scale next to the tube measures the difference of
levels. The resulting height difference "H" is a direct measurement of the
pressure or vacuum with respect to atmospheric pressure. In the absence
of differential pressure both levels would be equal, and this would be used
as the zero point.
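The gravimetric principle described above is the hydrostatic relation P = ρgh, which converts the height difference H into a pressure. The sketch below is a hedged illustration only: it uses standard gravity and approximate fluid densities, whereas a real manometer calibration also corrects for fluid temperature and local gravity.

```python
# Convert a manometer height difference to pressure via P = rho * g * h.
G = 9.80665                      # m/s^2, standard gravity
DENSITY = {"mercury": 13595.1,   # kg/m^3 at 0 C (approximate)
           "water":   999.0}     # kg/m^3 near room temperature (approximate)

def pressure_from_height(height_m, fluid="mercury"):
    """Differential pressure (Pa) indicated by a column height difference."""
    return DENSITY[fluid] * G * height_m

# One inch (25.4 mm) of mercury corresponds to roughly 3.39 kPa.
print(pressure_from_height(0.0254, "mercury"))   # ~3386 Pa
print(pressure_from_height(0.0254, "water"))     # ~249 Pa (one inch of water)
```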
The Industrial Revolution saw the adoption of "indirect" pressure
measuring devices, which were more practical than the manometer.[20]
An example is in high pressure (up to 50 psi) steam engines, where mercury was used to reduce the scale length to about 60 inches, but such a manometer was expensive and prone to damage.[21] This stimulated the development of indirect reading instruments, of which the Bourdon tube invented by Eugène Bourdon is a notable example.
Direct reading design of a U-tube manometer
Indirect reading design showing a Bourdon tube from the front (left) and the rear (right).
In the front and back views of a Bourdon gauge on the right, applied pressure at the bottom
fitting reduces the curl on the flattened pipe proportionally to pressure. This moves the free
end of the tube which is linked to the pointer. The instrument would be calibrated against a
manometer, which would be the calibration standard. For measurement of indirect
quantities of pressure per unit area, the calibration uncertainty would be dependent on the
density of the manometer fluid, and the means of measuring the height difference. From
this other units such as pounds per square inch could be inferred and marked on the scale.
See also
Calibration curve
Calibrated geometry
Calibration (statistics)
Color calibration – used to calibrate a computer monitor or display.
Deadweight tester
EURAMET Association of European NMIs
Measurement Microphone Calibration
Measurement uncertainty
Musical tuning – tuning, in music, means calibrating musical instruments to play the right pitch.
Precision measurement equipment laboratory
Scale test car – a device used to calibrate weighing scales that weigh railroad cars.
Systems of measurement
References
1. JCGM 200:2008 International vocabulary of metrology (http://www.bipm.org/utils/common/docu
ments/jcgm/JCGM_200_2008.pdf) Archived (https://web.archive.org/web/20191031152041/http
s://www.bipm.org/utils/common/documents/jcgm/JCGM_200_2008.pdf) 2019-10-31 at the
Wayback Machine — Basic and general concepts and associated terms (VIM)
2. ISO 9001: "Quality management systems — Requirements" (2008), section 7.6.
3. ISO 17025: "General requirements for the competence of testing and calibration laboratories"
(2005), section 5.
4. Faison, C. Douglas; Brickenkamp, Carroll S. (March 2004). "Calibration Laboratories: Technical
Guide for Mechanical Measurements" (https://web.archive.org/web/20150512215554/http://w
ww.nist.gov/nvlap/upload/hb150-2g-1.pdf) (PDF). NIST Handbook 150-2G. NIST. Archived from
the original (https://www.nist.gov/nvlap/upload/hb150-2g-1.pdf) (PDF) on 12 May 2015.
Retrieved 14 June 2015.
5. "Metrology, Pressure, Thermal & Eletrotechnical Measurement and Calibration" (https://web.arc
hive.org/web/20150614205726/http://www.fcriindia.com/national-training-2/metrology-pressu
re-thermal-electrotechnical-measurement-calibration/). Fluid Control Research Institute (FCRI),
Ministry of Heavy Industries & Public Enterprises, Govt. of India. Archived from the original (htt
p://www.fcriindia.com/national-training-2/metrology-pressure-thermal-electrotechnical-measur
ement-calibration/) on 14 June 2015. Retrieved 14 June 2015.
6. Haider, Syed Imtiaz; Asif, Syed Erfan (16 February 2011). Quality Control Training Manual:
Comprehensive Training Guide for API, Finished Pharmaceutical and Biotechnologies
Laboratories (https://books.google.com/books?id=-djll_c9Z9MC&pg=PA49). CRC Press. p. 49.
ISBN 978-1-4398-4994-1.
7. Bare, Allen (2006). Simplified Calibration Interval Analysis (http://sti.srs.gov/fulltext/2006/ms20
06099.pdf) (PDF). Aiken, SC: NCSL International Workshop and Symposium, under contract
with the Office of Scientific and Technical Information, U.S. Department of Energy. pp. 1–2.
Archived (https://web.archive.org/web/20070418143512/http://sti.srs.gov/fulltext/2006/ms20
06099.pdf) (PDF) from the original on 2007-04-18. Retrieved 28 November 2014.
8. "ANSI/NCSL Z540.3-2006 (R2013)" (https://web.archive.org/web/20141120230954/http://ww
w.ncsli.org/I/i/p/z3/c/a/p/NCSL_International_Z540.3_Standard.aspx?hkey=7de83171-16ff-41
6c-9182-94c8447fb300). The National Conference of Standards Laboratories (NCSL)
International. Archived from the original (http://www.ncsli.org/I/i/p/z3/c/a/p/NCSL_Internation
al_Z540.3_Standard.aspx?hkey=7de83171-16ff-416c-9182-94c8447fb300) on 2014-11-20.
Retrieved 28 November 2014.
9. "Calibration Systems Requirements (Military Standard)" (https://web.archive.org/web/2005103
0004254/http://medivactech.com/revA.pdf) (PDF). Washington, DC: U.S. Department of
Defense. 1 August 1998. Archived from the original (http://www.medivactech.com/revA.pdf)
(PDF) on 2005-10-30. Retrieved 28 November 2014.
10. Ligowski, M.; Jabłoński, Ryszard; Tabe, M. (2011), Jabłoński, Ryszard; Březina, Tomaš (eds.),
Procedure for Calibrating Kelvin Probe Force Microscope, Mechatronics: Recent Technological
and Scientific Advances, p. 227, doi:10.1007/978-3-642-23244-2 (https://doi.org/10.1007%2F9
78-3-642-23244-2), ISBN 978-3-642-23244-2, LCCN 2011935381 (https://lccn.loc.gov/2011935
381)
11. Military Handbook: Evaluation of Contractor's Calibration System (http://www.barringer1.com/mi
l_files/MIL-HDBK-52.pdf) (PDF). U.S. Department of Defense. 17 August 1984. p. 7. Archived (h
ttps://web.archive.org/web/20141204234336/http://www.barringer1.com/mil_files/MIL-HDBK-
52.pdf) (PDF) from the original on 2014-12-04. Retrieved 28 November 2014.
12. Procedure for calibrating pressure gauges (USBR 1040) (https://www.usbr.gov/pmts/geotech/ro
ck/EMpart_2/USBR1040.pdf) (PDF). U.S. Department of the Interior, Bureau of Reclamation.
pp. 70–73. Archived (https://web.archive.org/web/20130512121612/http://www.usbr.gov/pmt
s/geotech/rock/EMpart_2/USBR1040.pdf) (PDF) from the original on 2013-05-12. Retrieved
28 November 2014.
13. "KNC Model 3666 Automatic Pressure Calibration System" (https://web.archive.org/web/20141
204112439/http://www.kingnutronics.com/Model%203666%20Automatic%20Pressure%20Cali
bration%20System.pdf) (PDF). King Nutronics Corporation. Archived from the original (http://w
ww.kingnutronics.com/Model%203666%20Automatic%20Pressure%20Calibration%20System.
pdf) (PDF) on 2014-12-04. Retrieved 28 November 2014.
14. "the definition of calibrate" (http://dictionary.reference.com/browse/calibrate). Dictionary.com.
Retrieved 18 March 2018.
15. Baber, Zaheer (1996). The Science of Empire: Scientific Knowledge, Civilization, and Colonial Rule
in India (https://books.google.com/books?id=ucDBJSxaCPYC&pg=PA14). SUNY Press. pp. 23–
24. ISBN 978-0-7914-2919-8.
16. Franceschini, Fiorenzo; Galetto, Maurizio; Maisano, Domenico; Mastrogiacomo, Luca; Pralio,
Barbara (6 June 2011). Distributed Large-Scale Dimensional Metrology: New Insights (https://bo
oks.google.com/books?id=bIwbFtXyMcMC&pg=PA117). Springer Science & Business Media.
pp. 117–118. ISBN 978-0-85729-543-9.
17. Ackroyd, Peter (16 October 2012). Foundation: The History of England from Its Earliest
Beginnings to the Tudors (https://books.google.com/books?id=Z2XHs80O0OEC&pg=PT133).
St. Martin's Press. pp. 133–134. ISBN 978-1-250-01367-5.
18. Bland, Alfred Edward; Tawney, Richard Henry (1919). English Economic History: Select
Documents (https://archive.org/details/ajc0024.0001.001.umich.edu). Macmillan Company.
pp. 154 (https://archive.org/details/ajc0024.0001.001.umich.edu/page/154)–155.
19. Tilford, Charles R (1992). "Pressure and vacuum measurements" (https://web.archive.org/web/
20141205044516/http://www.glb.nist.gov/calibrations/upload/pmc-2.pdf) (PDF). Physical
Methods of Chemistry: 106–173. Archived from the original (http://www.glb.nist.gov/calibration
s/upload/pmc-2.pdf) (PDF) on 2014-12-05. Retrieved 28 November 2014.
20. Fridman, A. E.; Sabak, Andrew; Makinen, Paul (23 November 2011). The Quality of
Measurements: A Metrological Reference (https://books.google.com/books?id=kyX-1VzPokQC&
pg=PA111). Springer Science & Business Media. pp. 10–11. ISBN 978-1-4614-1478-0.
21. Cuscó, Laurence (1998). Guide to the Measurement of Pressure and Vacuum. London: The
Institute of Measurement and Control. p. 5. ISBN 0-904457-29-X.