Micrometer


A micrometer, sometimes known as a micrometer screw gauge, is a device incorporating a calibrated screw, widely used for the accurate measurement of components in mechanical engineering, machining, and most mechanical trades, alongside other metrological instruments such as dial, vernier, and digital calipers. Micrometers are usually, but not always, in the form of calipers. The spindle is a very accurately machined screw, and the object to be measured is placed between the spindle and the anvil. The spindle is moved by turning the ratchet knob or thimble until the object to be measured is lightly touched by both the spindle and the anvil.
Micrometers are also used in telescopes or microscopes to measure the apparent diameter of celestial bodies or microscopic objects. The micrometer used with a telescope was invented about 1638 by William Gascoigne, an English astronomer.

History of the device and its name

The word micrometer is a neoclassical coinage. The Merriam-Webster Collegiate Dictionary says that English got it from French and that its first known appearance in English writing was in 1670. Neither the metre nor the micrometre nor the micrometer as we know them today existed at that time. However, the people of that time did have much need for, and interest in, the ability to measure small things and small differences. The word was no doubt coined in reference to this endeavor, even if it did not refer specifically to its present-day senses.
The first ever micrometric screw was invented by William Gascoigne in the 17th century, as an enhancement of the vernier; it was used in a telescope to measure angular distances between stars and the relative sizes of celestial objects.
Henry Maudslay built a bench micrometer in the early 19th century that was nicknamed "the Lord Chancellor" among his staff because it was the final judge on measurement accuracy and precision in the firm's work. In 1844, details of Whitworth's workshop micrometer were published. This was described as having a strong frame of cast iron, at the opposite ends of which were two highly finished steel cylinders that could be traversed longitudinally by the action of screws. The ends of the cylinders where they met were hemispherical. One screw was fitted with a wheel graduated to measure to the ten-thousandth of an inch. His object was to furnish ordinary mechanics with an instrument which, while it afforded very accurate indications, was not very liable to be deranged by the rough handling of the workshop.
The first documented development of handheld micrometer-screw calipers was by Jean Laurent Palmer of Paris in 1848; the device is therefore often called palmer in French, tornillo de Palmer in Spanish, and calibro Palmer in Italian. The micrometer caliper was introduced to the mass market in anglophone countries by Brown & Sharpe in 1867, allowing the penetration of the instrument's use into the average machine shop. Brown & Sharpe were inspired by several earlier devices, one of them being Palmer's design. In 1888, Edward W. Morley added to the precision of micrometric measurements and proved their accuracy in a complex series of experiments.
The culture of toolroom accuracy and precision, which started with interchangeability pioneers including Gribeauval, Tousard, North, Hall, Whitney, and Colt, and continued through leaders such as Maudslay, Palmer, Whitworth, Brown, Sharpe, Pratt, Whitney, Leland, and others, grew during the Machine Age to become an important part of combining applied science with technology. Beginning in the early 20th century, one could no longer truly master tool and die making, machine tool building, or engineering without some knowledge of the science of metrology, as well as the sciences of chemistry and physics.

Types

Basic types

Specialized types

Each type of micrometer caliper can be fitted with specialized anvils and spindle tips for particular measuring tasks. For example, the anvil may be shaped in the form of a segment of screw thread, in the form of a v-block, or in the form of a large disc.
Micrometers use a screw to transform small axial distances into rotations large enough to read from a scale. The accuracy of a micrometer derives from the accuracy of the thread-forms that are central to its design. In some cases the screw is a differential screw. The basic operating principles of a micrometer are as follows:
  1. The amount of rotation of an accurately made screw can be directly and precisely correlated to a certain amount of axial movement, through the constant known as the screw's lead. A screw's lead is the distance it moves forward axially with one complete turn.
  2. With an appropriate lead and major diameter of the screw, a given amount of axial movement will be amplified in the resulting circumferential movement.
For example, if the lead of a screw is 1 mm, but the major diameter is 10 mm, then the circumference of the screw is 10π, or about 31.4 mm. Therefore, an axial movement of 1 mm is amplified to a circumferential movement of 31.4 mm. This amplification allows a small difference in the sizes of two similar measured objects to correlate to a larger difference in the position of a micrometer's thimble. In some micrometers, even greater accuracy is obtained by using a differential screw adjuster to move the thimble in much smaller increments than a single thread would allow.
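The amplification described above can be sketched numerically. This is an illustrative calculation only; the function name and parameters are not part of any micrometer specification:

```python
import math

def thimble_travel_mm(axial_mm, lead_mm, major_diameter_mm):
    """Circumferential thimble travel produced by a given axial spindle travel."""
    turns = axial_mm / lead_mm                   # full rotations required
    circumference = math.pi * major_diameter_mm  # circumferential travel per rotation
    return turns * circumference

# 1 mm of axial travel on a screw with 1 mm lead and 10 mm major diameter:
print(round(thimble_travel_mm(1.0, 1.0, 10.0), 1))  # 31.4, as in the text
```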
In classic-style analog micrometers, the position of the thimble is read directly from scale markings on the thimble and sleeve. A vernier scale is often included, which allows the position to be read to a fraction of the smallest scale mark. In digital micrometers, an electronic readout displays the length digitally on an LCD on the instrument. There also exist mechanical-digit versions, in the style of car odometers, where the numbers "roll over".

Parts

A micrometer is composed of:
; Frame: The C-shaped body that holds the anvil and barrel in constant relation to each other. It is thick because it needs to minimize flexion, expansion, and contraction, which would distort the measurement. The frame is heavy and consequently has a high thermal mass, to prevent substantial heating by the holding hand or fingers. It is often covered by insulating plastic plates, which further reduce heat transfer. For example, if one holds the frame long enough for it to heat up by 10 °C, then any 10 cm length of steel increases in length by on the order of 1/100 mm, which is within the typical accuracy range of micrometers. Micrometers typically have a specified temperature at which the measurement is correct; toolrooms are generally kept at 20 °C.
; Anvil: The shiny part that the spindle moves toward, and that the sample rests against.
; Sleeve, barrel, or stock: The stationary round component with the linear scale on it, sometimes with vernier markings. In some instruments the scale is marked on a tight-fitting but movable cylindrical sleeve fitting over the internal fixed barrel. This allows zeroing to be done by slightly altering the position of the sleeve.
; Lock nut, lock-ring, or thimble lock: The knurled component that one can tighten to hold the spindle stationary, such as when momentarily holding a measurement.
; Screw: The heart of the micrometer, as explained under "Operating principles" above. It is inside the barrel. Its central role is reflected in the usual name for the device in German, Messschraube, literally "measuring screw".
; Spindle: The shiny cylindrical component that the thimble causes to move toward the anvil.
; Thimble: The graduated component that one's thumb turns.
; Ratchet stop: The device on the end of the handle that limits applied pressure by slipping at a calibrated torque.

Reading

Customary/Imperial system

The spindle of a micrometer graduated for the Imperial and US customary measurement systems has 40 threads per inch, so that one turn moves the spindle axially 0.025 inch, equal to the distance between adjacent graduations on the sleeve. The 25 graduations on the thimble allow the 0.025 inch to be further divided, so that turning the thimble through one division moves the spindle axially 0.001 inch. Thus, the reading is given by the number of whole divisions that are visible on the scale of the sleeve, multiplied by 25, plus the number of that division on the thimble which coincides with the axial zero line on the sleeve. The result will be the diameter expressed in thousandths of an inch. As the numbers 1, 2, 3, etc., appear below every fourth sub-division on the sleeve, indicating hundreds of thousandths, the reading can easily be taken.
Suppose the thimble were screwed out so that graduation 2, and three additional sub-divisions, were visible on the sleeve, and that graduation 1 on the thimble coincided with the axial line on the sleeve. The reading would then be 0.2000 + 0.075 + 0.001, or 0.276 inch.
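The arithmetic of such a reading can be sketched in integer thousandths of an inch. The helper function below is illustrative, not a standard formula:

```python
def imperial_reading_thousandths(sleeve_divisions, thimble_division):
    """Reading in thousandths of an inch: each visible sleeve division is
    0.025 inch (25 thousandths); each thimble division adds 0.001 inch."""
    return sleeve_divisions * 25 + thimble_division

# Graduation 2 (= 8 divisions of 0.025") plus three more sub-divisions
# gives 11 visible divisions; thimble graduation 1 is on the axial line:
print(imperial_reading_thousandths(11, 1) / 1000)  # 0.276 inch
```

Working in integer thousandths sidesteps floating-point rounding and matches the trade habit of quoting readings in "thou".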

Metric system

The spindle of an ordinary metric micrometer has 2 threads per millimetre, and thus one complete revolution moves the spindle through a distance of 0.5 millimetre. The longitudinal line on the sleeve is graduated with 1 millimetre divisions and 0.5 millimetre subdivisions. The thimble has 50 graduations, each being 0.01 millimetre. Thus, the reading is given by the number of millimetre divisions visible on the scale of the sleeve plus the particular division on the thimble which coincides with the axial line on the sleeve.
Suppose that the thimble were screwed out so that graduation 5, and one additional 0.5 subdivision were visible on the sleeve, and that graduation 28 on the thimble coincided with the axial line on the sleeve. The reading then would be 5.00 + 0.5 + 0.28 = 5.78 mm.
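The same metric arithmetic, as an illustrative sketch (the function name and parameters are hypothetical):

```python
def metric_reading_mm(whole_mm, half_mark_visible, thimble_division):
    """Sleeve whole-millimetre marks, plus an optional extra 0.5 mm mark,
    plus the thimble graduation (each graduation worth 0.01 mm)."""
    return whole_mm + (0.5 if half_mark_visible else 0.0) + thimble_division * 0.01

# Graduation 5 plus one 0.5 mm subdivision on the sleeve, thimble at 28:
print(round(metric_reading_mm(5, True, 28), 2))  # 5.78 mm
```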

Vernier micrometers

Some micrometers are provided with a vernier scale on the sleeve in addition to the regular graduations. These permit measurements within 0.001 millimetre to be made on metric micrometers, or 0.0001 inch on inch-system micrometers.
The additional digit of these micrometers is obtained by finding the line on the sleeve vernier scale which exactly coincides with one on the thimble. The number of this coinciding vernier line represents the additional digit.
Thus, the reading for metric micrometers of this type is the number of whole millimeters and the number of hundredths of a millimeter, as with an ordinary micrometer, and the number of thousandths of a millimeter given by the coinciding vernier line on the sleeve vernier scale.
For example, a measurement of 5.783 millimetres would be obtained by reading 5.5 millimetres on the sleeve, and then adding 0.28 millimetre as determined by the thimble. The vernier would then be used to read the remaining 0.003 millimetre.
Inch micrometers are read in a similar fashion.
Note: 0.01 millimetre ≈ 0.00039 inch, and 0.002 millimetre ≈ 0.00008 inch; alternatively, 0.0001 inch = 0.00254 millimetre. Therefore, metric micrometers provide smaller measuring increments than comparable inch-unit micrometers: the smallest graduation of an ordinary inch-reading micrometer is 0.001 inch; the vernier type has graduations down to 0.0001 inch. When using either a metric or inch micrometer without a vernier, smaller readings than those graduated may of course be obtained by visual interpolation between graduations.
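Putting the three scales together for the metric worked example above (the helper name and parameters are illustrative):

```python
def metric_vernier_reading_mm(sleeve_mm, thimble_division, vernier_line):
    """sleeve_mm: reading from the sleeve, including any 0.5 mm mark;
    thimble_division: hundredths of a millimetre from the thimble;
    vernier_line: thousandths of a millimetre from the coinciding vernier line."""
    return sleeve_mm + thimble_division * 0.01 + vernier_line * 0.001

# 5.5 mm on the sleeve, thimble at 28, vernier line 3 coinciding:
print(round(metric_vernier_reading_mm(5.5, 28, 3), 3))  # 5.783 mm
```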

Torque repeatability via torque-limiting ratchets or sleeves

A micrometer reading is not accurate if the thimble is over- or under-torqued. A useful feature of many micrometers is the inclusion of a torque-limiting device on the thimble—either a spring-loaded ratchet or a friction sleeve. Without this device, workers may overtighten the micrometer on the work, causing the mechanical advantage of the screw to tighten the screw threads or squeeze the material, giving an inaccurate measurement. However, with a thimble that will ratchet or friction slip at a certain torque, the micrometer will not continue to advance once sufficient resistance is encountered. This results in greater accuracy and repeatability of measurements—most especially for low-skilled or semi-skilled workers, who may not have developed the light, consistent touch of a skilled user.
It might seem that there would be no such thing as too little torque on the thimble, because if zero tightening of the threads is the goal, then the less torque, the better. However, there is a practical limit on this ideal. Some tiny amount of torque, although very slight, is involved in the normal hand movements of well-practiced micrometer use. It is light but not truly zero, because zero is impractical for a skillful feel of how the contact is being made, and the calibration reflects this amount, as tiny as it is. If one then changes to an "afraid to even touch it" sort of touch, one is being inconsistent with the norm that the calibration reflects, resulting in a reading that is 1 to 3 tenths too big.
Related to this torque topic is interuser variation in what is normal. It is important to try not to have an idiosyncratic touch, because although it works perfectly well for intrauser consistency, it interferes with interuser consistency. Some people use a rather heavy touch as a matter of habit, and this is fine in that they can get highly accurate readings as long as they calibrate their micrometer accordingly. The problem arises when they use someone else's micrometer, or when someone uses theirs. The heavy-touch user gets false-small readings, and the normal-touch user gets false-big readings. This may not arise in one-person shops, but teams of workers sharing company-owned instruments must be capable of interpersonal consistency to do close-tolerance work successfully. There is a good and easy way to synchronize on this topic: it is simply to get used to the "feel" of how much torque it takes to slip the typical friction sleeve or click the typical ratchet thimble, and then incorporate that same feel into every use of a micrometer, even those that have no sleeve or ratchet. This is proper training for the machining trade, although it is not uncommon to encounter coworkers who were not well trained on this point. In many cases it seems that in drilling the "don't overtorque" idea into trainees' heads, an opposite extreme is mistakenly taught, where the user thinks the goal is to compete with everyone else on who can generate the lightest touch. Individuals naturally differ in their touch, so such a competition is not as effective at generating interuser consistency as is "imagining that every thimble has a sleeve to slip."
Bench micrometers of the "super-mic" class entirely remove this interuser variation by having the user dial the handwheel until a needle reads zero on a gauge, producing the same contact pressure on every reading.

Calibration: testing and adjusting

Zeroing

On most micrometers, a small pin spanner is used to turn the sleeve relative to the barrel, so that its zero line is repositioned relative to the markings on the thimble. There is usually a small hole in the sleeve to accept the spanner's pin. This calibration procedure will cancel a zero error: the problem that the micrometer reads nonzero when its jaws are closed.

Testing

A standard one-inch micrometer has readout divisions of 0.001 inch and a rated accuracy of ±0.0001 inch.
Both the measuring instrument and the object being measured should be at room temperature for an accurate measurement; dirt, abuse, and low operator skill are the main sources of error.
The accuracy of micrometers is checked by using them to measure gauge blocks, rods, or similar standards whose lengths are precisely and accurately known. If the gauge block is known to be 0.75000 ± 0.00005 inch, then the micrometer should measure it as 0.7500 inch. If the micrometer measures 0.7503 inch, then it is out of calibration. Cleanliness and low torque are especially important when calibrating; every tenth of a thousandth of an inch, or hundredth of a millimetre, counts. A mere speck of dirt, or a mere bit too much squeeze, obscures the truth of whether the instrument is able to read correctly. The solution is simple: cleaning, patience, due care and attention, and repeated measurements.
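The pass/fail logic of such a check can be sketched as follows. The function name is illustrative; the default tolerance is the rated accuracy of a standard one-inch micrometer mentioned above:

```python
def check_calibration(measured_in, nominal_in, rated_accuracy_in=0.0001):
    """Compare a micrometer reading against a gauge block of known size (inches).
    Returns the signed error and whether it is within the rated accuracy."""
    error = measured_in - nominal_in
    return error, abs(error) <= rated_accuracy_in

# A 0.75000 inch gauge block measured as 0.7503 inch:
error, in_cal = check_calibration(0.7503, 0.75000)
print(in_cal)  # False: 0.0003 inch of error means the micrometer is out of calibration
```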
Calibration typically checks the error at 3 to 5 points along the range. Only one can be adjusted to zero. If the micrometer is in good condition, then they are all so near to zero that the instrument seems to read essentially dead-on all along its range; no noticeable error is seen at any locale. In contrast, on a worn-out micrometer, one can "chase the error up and down the range", that is, move it up or down to any of various locales along the range by adjusting the sleeve, but one cannot eliminate it from all locales at once.
Calibration can also include the condition of the tips, any ratchet, and linearity of the scale. Flatness and parallelism are typically measured with a gauge called an optical flat, a disc of glass or plastic ground with extreme accuracy to have flat, parallel faces, which allows light bands to be counted when the micrometer's anvil and spindle are against it, revealing their amount of geometric inaccuracy.
Commercial machine shops, especially those that do certain categories of work, are required by various standards organizations to calibrate micrometers and other gauges on a schedule, to affix a label to each gauge that gives it an ID number and a calibration expiration date, to keep a record of all the gauges by ID number, and to specify in inspection reports which gauge was used for a particular measurement.
Not all calibration is an affair for metrology labs. A micrometer can be calibrated on-site anytime, at least in the most basic and important way, by measuring a high-grade gauge block and adjusting to match. Even gauges that are calibrated annually and within their expiration timeframe should be checked this way every month or two, if they are used daily. They usually will check out OK as needing no adjustment.
The accuracy of the gauge blocks themselves is traceable through a chain of comparisons back to a master standard such as the international prototype of the metre. This bar of metal, like the international prototype of the kilogram, is maintained under controlled conditions at the International Bureau of Weights and Measures headquarters in France, one of the principal measurement standards laboratories of the world. These master standards have extreme-accuracy regional copies, and metrological equipment makes the chain of comparisons. Because the definition of the metre is now based on a light wavelength, the international prototype of the metre is not quite as indispensable as it once was. But such master gauges are still important for calibrating and certifying metrological equipment. Equipment described as "NIST traceable" means that its comparison against master gauges, and their comparison against others, can be traced back through a chain of documentation to equipment in the NIST labs. Maintaining this degree of traceability requires some expense, which is why NIST-traceable equipment is more expensive than non-NIST-traceable equipment. But applications needing the highest degree of quality control mandate the cost.

Adjustment

A micrometer that has been zeroed and tested and found to be off might be restored to accuracy by further adjustment. If the error originates from the parts of the micrometer being worn out of shape and size, then restoration of accuracy by this means is not possible; rather, repair is required. For standard kinds of instruments, in practice it is easier and faster, and often no more expensive, to buy a new one rather than pursue refurbishment.