Calibration Fundamentals 111

This class provides a basic introduction to the importance of calibrating measuring instruments. Calibration determines the accuracy of a measuring instrument by comparing its readings to a higher-level measurement standard, often a working standard such as a gage block. Measurement standards follow a hierarchy consisting of primary, secondary, and working standards, and traceability links these standards together. Measurement uncertainty estimates the accuracy of a measurement and provides the range in which the true value of a measurement is expected to lie.

While uncertainty and errors exist in every measurement, careful calibration can help to minimize inaccuracy when inspecting parts with measuring instruments. After taking this class, users should be able to explain how calibration and traceability impact the use and care of inspection devices.

  • Difficulty: Beginner
  • Format: Online
  • Number of Lessons: 15
  • Language: English



Course Outline
  • Calibration
  • The Importance of Calibration
  • Measurement Standards
  • Calibration and Measurement Standards
  • ISO 9000 Requirements
  • Traceability
  • Working Standards
  • Standards and Traceability
  • Measurement Uncertainty
  • Uncertainty vs. Error
  • Random and Systematic Errors
  • Gage Blocks
  • Calibration and Mastering Frequency
  • Factors Affecting Calibration
  • Calibration Review
Objectives
  • Describe calibration.
  • Describe how tolerances and variation impact the necessity of calibration.
  • Describe measurement standards for calibration.
  • Describe ISO 9000 calibration and certification requirements.
  • Describe traceability.
  • Describe the role of working standards.
  • Describe measurement uncertainty.
  • Distinguish between uncertainty and error.
  • Distinguish between random and systematic errors.
  • Describe how gage blocks are used in calibration.
  • Describe the importance of regular calibration.
  • Describe the key factors that affect calibration.
Vocabulary Terms

acceptance zone

A tolerance determined by subtracting measurement uncertainty from the original tolerance. Acceptance zones prevent out-of-tolerance parts from passing measurement or inspection.
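The arithmetic can be sketched in a few lines (the nominal, tolerance, and uncertainty values below are hypothetical):

```python
# Acceptance zone: shrink the tolerance band by the measurement
# uncertainty so an out-of-tolerance part cannot slip through.
# All values are hypothetical, in millimeters.
nominal = 25.000       # target dimension
tolerance = 0.010      # allowed deviation: +/- 0.010 mm
uncertainty = 0.002    # estimated measurement uncertainty

# Guard-banded limits: subtract the uncertainty from each side
# of the original tolerance band.
accept_low = nominal - (tolerance - uncertainty)
accept_high = nominal + (tolerance - uncertainty)

def accept(measured):
    """Pass a part only if it falls inside the acceptance zone."""
    return accept_low <= measured <= accept_high
```

A part measuring 25.009 mm would pass the original ±0.010 mm tolerance but fails the tighter acceptance zone, because measurement uncertainty means its true value could still lie outside tolerance.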


accuracy

The difference between a measurement reading and the true value of that measurement. The less error present in the measurement, the more accurate the results.


calibration

The comparison and adjustment of a device with unknown accuracy to a device with a known, accurate standard. Calibration detects and corrects inaccuracy in the device being checked.


drift

The actual change in a measurement value when the same characteristic is measured by the same operator under the same conditions at different points in time. Drift indicates how often a measuring instrument needs recalibration.
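As an illustrative sketch (the readings and the limit below are hypothetical), drift can be tracked by re-checking a known master over time and flagging recalibration once the change grows too large:

```python
# Readings of the same master standard taken at successive checks.
# All values are hypothetical, in millimeters.
history = [25.001, 25.002, 25.004, 25.006]
drift_limit = 0.004    # recalibrate once drift exceeds this

# Drift is the change between the earliest and latest reading.
drift = history[-1] - history[0]
needs_recalibration = abs(drift) > drift_limit
```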


error

The difference between a measured value and its correct value. Errors should be minimized in the measuring process.

gage blocks

A hardened steel block manufactured with highly accurate dimensions that is used to measure part dimensions or calibrate other measurement devices. Gage blocks are available in a set of standardized lengths.

IATF 16949:2016

A revised version of standard ISO/TS 16949 developed by the International Automotive Task Force (IATF) to improve international standards in the automotive industry. IATF 16949:2016 includes additional input from a range of organizations in the automotive industry and throughout the automotive supply chain.

ISO 9000

A series of quality assurance standards published by the International Organization for Standardization. ISO 9000 is intended to guide an organization on the implementation and continual improvement of quality management.

ISO 9001:2015

The core standard of ISO 9000 that contains the requirements an auditor uses to verify conformity of a QMS. ISO 9001:2015 is titled 'Quality Management Systems Requirements' and presents the actual material to which a company is certified.

ISO/TS 16949

An automotive supply chain standard developed by the International Automotive Task Force (IATF). ISO/TS 16949 was intended to improve and standardize international automotive supply chain certification and was eventually revised as IATF 16949:2016.


mastering

A quick check of a measuring device against a known standard reference, such as a gage block, to determine if the device is providing accurate measurements. Mastering should be done regularly for measuring devices.

measurement standard

A recognized true value. Calibration must compare measurement values to a known standard.


precision

The degree to which an instrument will repeat the same measurement over a period of time. Precision is also called repeatability, as it shows the same results under unchanged conditions.

primary standards

A measurement standard of the highest quality; its value is not referenced to any higher standard. Primary standards are usually either national or international standards, depending on the nation and the context in which the standard is used.

quality system

The objectives and processes of a company designed to focus on quality and customer satisfaction. Quality system requirements for creation and implementation can be found in ISO 9001.

random errors

An error resulting from unpredictable variations that occur when measurements are repeated. Random errors are inconsistent and, unlike systematic errors, cannot be corrected through calibration.

secondary standards

A measurement standard calibrated by comparison with a primary standard. Secondary standards are also known as transfer standards and serve as references for working standards.

systematic errors

An error not determined by chance but introduced by an inaccuracy in the measurement system. Systematic errors occur regularly and are corrected through calibration.
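The contrast between the two error types can be illustrated with a small simulation (the true value, bias, and scatter figures are hypothetical): the systematic part is a repeatable offset that calibration can estimate and remove, while the random part remains as scatter:

```python
import random

random.seed(42)  # reproducible illustration

true_value = 10.000   # hypothetical true dimension, mm
bias = 0.050          # systematic error: a constant offset
scatter = 0.005       # random error: unpredictable spread

# Repeated readings carry the same bias plus fresh random noise.
readings = [true_value + bias + random.gauss(0, scatter)
            for _ in range(1000)]

# Averaging cancels most of the random error, exposing the bias...
systematic_estimate = sum(readings) / len(readings) - true_value

# ...which calibration can then subtract; only scatter remains.
corrected = [r - systematic_estimate for r in readings]
```

Averaging the corrected readings recovers the true value, but any single corrected reading still differs from it by its random error.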

time meters

A device used with a measuring instrument that records the number of hours an instrument operates. Time meters help determine how frequently an instrument needs calibration.


tolerance

An acceptable deviation from a desired dimension that still meets part specifications. Tolerances indicate the allowable difference between a physical feature and its intended design.


traceability

The ability to verify the history, application, or location of an item using documentation. For an instrument's calibration to be traceable, it must follow the hierarchy of standards through to the national or international standard.

true value

The actual value of a measurement. The true value can never be known with total certainty.


uncertainty

The measurement range in which the true value of a measurement is expected to lie. Uncertainty quantifies the amount of variation expected in a measurement.


variation

Any change or difference from the standard. A variation can signal that an error has occurred, but some measurement variation is unavoidable.

working standards

A measurement standard used to calibrate or master measuring instruments. Gage blocks are often used as working standards.


wringing

To twist and rub together so that two surfaces cling to one another. Wringing gage blocks creates various combinations to form almost any length for measurement.
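The way a stack of wrung blocks reaches a target length can be sketched with a simple greedy pick, largest block first (the block lengths below are hypothetical, not an actual standardized set):

```python
# Hypothetical set of available block lengths, in millimeters.
blocks = [100.0, 50.0, 25.0, 10.0, 5.0, 2.0, 1.5, 1.05, 1.005, 1.0]

def stack_for(target, available):
    """Greedily pick blocks (largest first) that sum to target,
    or return None if the set cannot build that length."""
    chosen, remaining = [], target
    for block in sorted(available, reverse=True):
        if block <= remaining + 1e-9:   # tolerance for float rounding
            chosen.append(block)
            remaining -= block
    return chosen if abs(remaining) < 1e-9 else None
```

For example, `stack_for(36.0, blocks)` picks 25.0, 10.0, and 1.0. Real gage block selection typically works from the smallest decimal place upward rather than greedily, but the wrung stack's lengths sum the same way.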