BELIEVE ME NOT! -- A SKEPTIC'S GUIDE


Tolerance

(Advertising Your Uncertainty)

Virtually all [I could follow the consensus and say all, but I feel like hedging] "scientific" procedures involve measurement of experimental parameters such as distance, time, velocity, mass, energy, temperature, etc. Virtually all measurements are subject to error; that is, they may be inaccurate (wrong) by some unknown amount due to effects ranging from errors in recording ["I said 3.32, not 3.23!"] to miscalibrated instruments ["I thought these tic marks were centimetres!"]. Such "systematic errors" are embarrassing to the experimenter, as they imply poor technique, and are always hard to estimate; but we are honour-bound to try.

An entirely different source of error, one that carries no negative connotations for the experimenter, is the fact that all measurements have limited precision or "tolerance" - limited by the "marks" on the generalized "ruler" used for measuring-by-comparison. (E.g., the distance you measure with a micrometer is more precisely known than the distance you measure with a cloth tape measure.)

Knowing this, most scientists and virtually all physicists have an æsthetic about measured values of things: they are never to be reported without an explicit estimation of their uncertainty. That is, measurements must always be reported in the form

    (VALUE $\pm$ UNCERTAINTY) UNITS

or equivalent notation (sometimes a shorthand version), such as 3.1416(12) radians, meaning (3.1416 $\pm$ 0.0012) radians. [The (12) means the uncertainty in the last two digits is $\pm$ 12.] This shorthand form is convenient for long strings of digits with only the last 1 or 2 digits uncertain, but the explicit form with the $\pm$ is more pleasing to the æsthetic mentioned above.
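
To make the shorthand concrete, here is a minimal Python sketch (my own illustration, not from the original text; the function name and parsing rules are assumptions) that expands the parenthesized form into the explicit $\pm$ form:

    def expand_shorthand(s):
        """Expand shorthand like '3.1416(12)' into '3.1416 +/- 0.0012'.

        The parenthesized digits give the uncertainty in the last digits
        of the value, so they inherit its decimal place: (12) on a value
        with 4 decimal places means 12 x 10^-4 = 0.0012.
        """
        value, unc = s.rstrip(")").split("(")
        decimals = len(value.split(".")[1]) if "." in value else 0
        uncertainty = int(unc) * 10.0 ** (-decimals)
        return f"{value} +/- {uncertainty:.{decimals}f}"

    print(expand_shorthand("3.1416(12)"))   # prints: 3.1416 +/- 0.0012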

When, as in some elementary particle physics experiments lasting many years and costing millions of dollars, a great deal of effort has gone into measuring a single number, it is common practice to make a clear distinction between "statistical errors" (the precision of our instrumentation) and suspected "systematic errors" (mistakes). In most situations, however, both are lumped together or "added in quadrature" (the total uncertainty is the square root of the sum of the squares of the uncertainties due to all the independent sources of error). It is considered poor form to cavalierly overestimate one's uncertainty to reduce the significance of deviations from expectations.
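
In symbols: if $\delta_1, \delta_2, \ldots$ are the uncertainties contributed by the independent sources of error, adding them in quadrature gives a total uncertainty

    $\delta_{\rm tot} = \sqrt{\delta_1^2 + \delta_2^2 + \cdots}$

A minimal Python sketch of the same rule (the function name and the sample numbers are my own, for illustration):

    import math

    def add_in_quadrature(uncertainties):
        """Combine independent uncertainties:
        the square root of the sum of the squares."""
        return math.sqrt(sum(u * u for u in uncertainties))

    # e.g. a 3 mm statistical and a 4 mm systematic uncertainty
    # combine to a 5 mm total (illustrative numbers, not from the text):
    print(add_in_quadrature([3.0, 4.0]))   # prints: 5.0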

To write a measured value without its tolerance (uncertainty, "possible error," etc.) is as bad form as leaving out the units of the measurement. The significance of your measurement is lost. To do this in the presence of physicists is like ordering Ripple with your meal at Maxim's. Sadly, values are slipping throughout society, and otherwise respectable scientists can often be heard to quote numbers without specifying uncertainties. The best we can do is to be sure we do not contribute to this decay.



 
Jess H. Brewer - Last modified: Fri Nov 13 16:26:03 PST 2015