Metrology is the science of measurement, embracing both experimental and theoretical determinations at any level of uncertainty in any field of science and technology.
Accuracy vs. Precision
Accuracy is the degree to which a measurement agrees with a known value, i.e. how correct and free from error the result is.
Precision is the degree to which repeated measurements agree with each other, often expressed as the number of digits used to report a measurement.
"Precision is measured with respect to detail and accuracy is measured with respect to reality."
"Accuracy is limited by the precision with which physical markings can be drawn, reproduced, viewed, and aligned."
Precision is independent of accuracy:
"For example, if on average, your measurements for a given substance are close to the known value, but the measurements are far from each other, then you have accuracy without precision."
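The distinction in the quote above can be sketched numerically: the closeness of the average to the known value measures accuracy, while the scatter of the readings measures precision. The known value and the readings below are assumed data for illustration only.

```python
import statistics

# Hypothetical readings of a quantity whose true (known) value is 10.0 units.
known_value = 10.0
readings = [9.2, 10.9, 9.1, 10.8, 10.0]  # assumed data for illustration

# Accuracy: how close the average reading is to the known value.
mean = statistics.mean(readings)
accuracy_error = abs(mean - known_value)

# Precision: how tightly the readings cluster (sample standard deviation).
spread = statistics.stdev(readings)

print(f"mean = {mean}, accuracy error = {accuracy_error:.2f}, spread = {spread:.2f}")
```

Here the mean happens to equal the known value (accurate on average), yet the individual readings scatter by almost a full unit (imprecise), matching the "accuracy without precision" case described above.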
By one common convention in numerical work, the accuracy of a number is given by the number of significant digits to the right of the decimal point, while its precision is the total number of significant digits.
The number of significant digits (or significant figures) is the number of digits needed to express the number to within the uncertainty of calculation. For example, if a quantity is known to be 1.234 ± 0.002, four figures would be significant.
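The 1.234 ± 0.002 example suggests a simple rule: round the uncertainty to one significant figure, then round the value to the same decimal place. The helper below is a hypothetical illustration of that rule, not a standard library function, and it ignores edge cases such as Python's round-half-to-even behaviour.

```python
import math

def round_to_uncertainty(value, uncertainty):
    # Round the uncertainty to one significant figure, then round the
    # value to the same decimal place. (Illustrative convention only.)
    exponent = math.floor(math.log10(abs(uncertainty)))
    factor = 10 ** -exponent
    u = round(uncertainty * factor) / factor
    v = round(value * factor) / factor
    return v, u

print(round_to_uncertainty(1.234, 0.002))  # a value known to ±0.002 keeps four figures
```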
- Accuracy is determined and limited by the precision with which physical markings can be created and reproduced, as well as read and aligned to the corresponding object to be measured.
- The maximum achievable reading precision is ± half of the smallest division of the scale.
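The half-division rule in the last bullet can be stated as a one-line calculation. The millimetre-graduated ruler and the reading below are assumed values for illustration.

```python
# Reading uncertainty of a graduated scale: ± half the smallest division.
# Example: a ruler graduated in millimetres (assumed instrument).
smallest_division_mm = 1.0
reading_uncertainty_mm = smallest_division_mm / 2

reading_mm = 42.0  # hypothetical reading
print(f"{reading_mm} mm \u00b1 {reading_uncertainty_mm} mm")
```

So no reading from this ruler can honestly be reported more finely than to the nearest 0.5 mm, regardless of how carefully the markings are aligned.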