Calibration Fundamentals 111
The class Calibration Fundamentals provides a basic introduction to the importance of calibrating measuring instruments. Calibration determines the accuracy of a measuring instrument by comparing its readings to a higher-level measurement standard, usually a working standard such as a gage block. Measurement standards follow a hierarchy consisting of primary, secondary, and working standards, and traceability links these standards together. Measurement uncertainty estimates the accuracy of a measurement: it is the range in which the true value of a measurement is expected to lie. High-accuracy parts require tight tolerances, and tighter tolerances require higher-accuracy measuring instruments. While uncertainty and error exist in every measurement, careful calibration helps minimize inaccuracy when inspecting parts with measuring instruments. After taking this class, users should be able to explain how calibration and traceability impact the use and care of inspection devices.
Number of Lessons 15
- What is Calibration?
- The Importance of Calibration
- Measurement Standards
- Hierarchy of Measurement Standards
- ISO 9000 Requirements
- Working Standards
- Standards and Traceability
- Measurement Uncertainty
- Uncertainty vs. Error
- Random and Systematic Errors
- Gage Blocks
- Factors Affecting Calibration
- Examples of Traceability
- Describe the main purpose of calibration.
- Describe how tolerances and variation impact the necessity of calibration.
- Explain why calibration requires measurement standards.
- Identify the hierarchy of measurement standards.
- Describe the ISO 9000 calibration requirement.
- Define traceability.
- Describe the role of working standards.
- Define measurement uncertainty.
- Distinguish between uncertainty and error.
- Distinguish between random and systematic errors.
- Describe how gage blocks are used in calibration.
- Explain the importance of regular calibration.
- Identify the key factors that affect calibration.
Accuracy: The difference between a measurement reading and the true value of that measurement. The less error present in the measurement, the more accurate the results.
Calibration: The comparison and adjustment of a device of unknown accuracy to a known, accurate standard. Calibration reduces variation in the device being checked.
Drift: The actual change in a measurement value when the same characteristic is measured by the same operator under the same conditions at different points in time. Drift indicates how often a measuring device needs recalibration.
Error: The difference between a measured value and its correct value. Errors should be eliminated from the measuring process.
Gage Block: A hardened steel block manufactured with highly accurate dimensions that is used to measure part dimensions after a part is made. Gage blocks are available in sets of standardized lengths.
ISO 9000: A set of standards published by the International Organization for Standardization. It lists the requirements for the creation and implementation of an effective quality management system.
The most recent edition of the ISO standards. It lists the current requirements for the creation and implementation of an effective quality management system.
Mastering: A quick check of a measuring device against a known standard reference, such as a gage block, to determine if the device is reading the measurement correctly. Instruments should be mastered on a regular basis.
Standard: A recognized true value. Calibration must compare measurement values to a known standard.
Precision: The degree to which an instrument will repeat the same measurement over a period of time. Precision is also called repeatability, as it will show the same results under unchanged conditions.
Primary Standard: A measurement standard with the highest quality. Its value is not referenced to higher standards.
Quality Management System: The objectives and processes of a company designed to focus on quality and customer satisfaction. The requirements for the creation and implementation of an effective quality management system can be found in ISO 9000.
Random Error: An error resulting from unpredictable variations that occur when measurements are repeated. Random errors are inconsistent and easily recognizable.
Secondary Standard: A measurement standard used in comparison with a primary standard. Also known as transfer standards, secondary standards are used in combination with other types of measuring machines.
Systematic Error: An error not determined by chance but introduced by an inaccuracy in the measurement system. Systematic errors occur regularly and are corrected through calibration.
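To illustrate the two error types, the following sketch (not part of the course; every number in it is an assumed value for illustration) simulates repeated readings from an instrument that has a constant bias plus random scatter. Comparing the mean reading to a known standard reveals the systematic error, which calibration can correct; the random scatter remains:

```python
import random
import statistics

# Hypothetical illustration: 20 repeated readings of a part whose true
# size is 25.000 mm. All numbers here are assumptions for the sketch.
random.seed(42)
TRUE_VALUE = 25.000  # mm
BIAS = 0.012         # mm, systematic error of an uncalibrated instrument

# Each reading = true value + constant bias + random scatter.
readings = [TRUE_VALUE + BIAS + random.gauss(0, 0.003) for _ in range(20)]

mean = statistics.mean(readings)
systematic_estimate = mean - TRUE_VALUE     # revealed by comparison to a standard
random_spread = statistics.stdev(readings)  # scatter calibration cannot remove

print(f"mean reading:          {mean:.4f} mm")
print(f"estimated bias:        {systematic_estimate:+.4f} mm")
print(f"random scatter (1 sd): {random_spread:.4f} mm")
```

Note how the bias shows up as a consistent offset of the mean, while the random error shows up as the standard deviation of the readings.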
Time Meter: A device used with a measuring instrument that records the number of hours an instrument operates. Time meters help determine how frequently an instrument needs calibration.
Tolerance: An unwanted but acceptable variation or deviation from a desired dimension of a part. The object will still meet specifications.
Traceability: The ability to verify the history, application, or location of an item using documentation. For an instrument's calibration to be traceable, it must follow the hierarchy of standards through to the national or international standard.
True Value: The actual value of a measurement. The true value can never be known with total certainty.
Uncertainty: The measurement range in which the true value of a measurement is expected to lie. Uncertainty is the amount of variation that is acceptable in a measurement.
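To make the definition concrete, here is a minimal sketch (the measured value, uncertainty, and specification limits are all assumed numbers, not course material) of reporting a measurement with its uncertainty range and then checking conformance against a part tolerance:

```python
# Hypothetical sketch: reporting a measurement with its uncertainty.
measured = 12.7005     # mm, reading from the instrument (assumed value)
uncertainty = 0.0025   # mm, estimated measurement uncertainty (assumed value)

low, high = measured - uncertainty, measured + uncertainty
print(f"result: {measured:.4f} mm +/- {uncertainty:.4f} mm")
print(f"true value expected within [{low:.4f}, {high:.4f}] mm")

# A conservative conformance check: the part is accepted only if the
# entire uncertainty range lies inside the specification limits.
spec_low, spec_high = 12.690, 12.710   # mm, assumed part tolerance
conforms = spec_low <= low and high <= spec_high
print("conforms:", conforms)  # → conforms: True
```

The conservative check reflects why tighter tolerances demand higher-accuracy instruments: as the uncertainty range grows, fewer in-tolerance parts can be confidently accepted.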
Variation: Any change or difference from the standard. A variation can signal that an error has occurred.
Working Standard: A measurement standard used to calibrate or master measuring instruments. Gage blocks are often used as working standards.
Wringing: To twist and rub together so that the two surfaces cling to one another. Gage blocks are wrung together in various combinations to form any length.
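Selecting which blocks to wring together makes a good worked example. The sketch below (a simplified metric set with assumed sizes; not from the course) picks blocks for a target length by eliminating the smallest decimal place first, the same procedure used with real gage block sets:

```python
def build_stack(target_mm):
    """Sketch: choose gage blocks to wring into a target length.

    Assumes a simplified metric set: 1.001-1.009 mm (0.001 steps),
    1.01-1.49 mm (0.01 steps), 0.5-9.5 mm (0.5 steps), 10-100 mm (10 steps).
    """
    stack = []
    rem = int(round(target_mm * 1000))  # work in micrometres to avoid float drift

    # 1) A 1.001-1.009 block removes the 0.001 mm digit.
    if rem % 10:
        b = 1000 + rem % 10
        stack.append(b)
        rem -= b

    # 2) A 1.01-1.49 block removes the 0.01 mm digit and leaves the
    #    0.1 mm digit at 0 or 5, so 0.5 mm-step blocks can finish.
    if rem % 500:
        hundredths = (rem // 10) % 10
        tenths = (rem // 100) % 10
        b = 1000 + (tenths % 5) * 100 + hundredths * 10
        stack.append(b)
        rem -= b

    # 3) A 0.5-9.5 mm block, then a 10-100 mm block, cover the rest.
    if rem % 10000:
        b = rem % 10000
        stack.append(b)
        rem -= b
    if rem:
        stack.append(rem)

    return [b / 1000 for b in stack]

print(build_stack(36.715))  # → [1.005, 1.21, 4.5, 30.0]
```

Working from the finest decimal place upward keeps the stack small, which matters because every wrung joint adds a little length uncertainty.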