
Scales / Precision

Name: Yo dawg 2009-04-07 19:41

Hey /sci/

How do I find the precision of a scale?

Let's say my scale can measure to two decimal places, e.g. 2.23 g. What is the precision of the instrument? Is it +/- 0.005 g, or +/- 0.001 g? Or am I completely off?

Name: Anonymous 2009-04-07 23:35

The simplest, and therefore my favourite, answer is that the precision is equal to one interval between marks on the scale.  Therefore, because your scale allows reporting to the nearest 0.01 g, the precision is 0.01 g.

Name: Anonymous 2009-04-07 23:41

The assumption is that the scale is being used in the standard manner, whereby the person reporting the measurement (or, in the case of electronically reported values, the device) does not attempt to report more precisely than the increments allow.  The standard way to take a measurement is to judge no more precisely than which mark on the scale the measured object is nearest to, with the custom that half of one increment is just sufficient to round up to the next mark.  Hence we call half an increment between marks the uncertainty in a reported measurement.

Name: Anonymous 2009-04-07 23:45

A reported measurement of 2.23 g could therefore have resulted from any actual mass between 2.225 g and 2.235 g, not including the upper bound 2.235 g itself, which by custom would round up and be reported as 2.24 g.  So the uncertainty of any reported measurement in this case is +/- 0.005 g.  Reporting 2.23 g means reporting 2.23 +/- 0.005 g by this convention.
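A minimal sketch of that rounding convention in Python (the masses here are made up for illustration; Decimal is used so the half-increment tie behaves exactly, which binary floats won't guarantee):

```python
from decimal import Decimal, ROUND_HALF_UP

INCREMENT = Decimal("0.01")          # smallest mark on the scale, in grams
UNCERTAINTY = INCREMENT / 2          # +/- 0.005 g, half of one increment

def report(true_mass):
    """Round a true mass (given as a string) to the nearest scale mark,
    with the tie at half an increment rounding up, per the convention above."""
    return Decimal(true_mass).quantize(INCREMENT, rounding=ROUND_HALF_UP)

print(report("2.2251"))   # 2.23  (just above the lower bound)
print(report("2.2349"))   # 2.23  (just below the upper bound)
print(report("2.2350"))   # 2.24  (the upper bound itself rounds up)
```

So everything in [2.225, 2.235) reports as 2.23, i.e. 2.23 +/- 0.005 g.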

Name: Anonymous 2009-04-07 23:50

However, note that the quantification of uncertainty or more generally, "error", can be discussed in a more complicated manner.  More of my notes follow:

Often, the uncertainty of a measurement is found by repeating the measurement enough times to get a good estimate of the standard deviation of the values.  Then, any single value has an uncertainty equal to the standard deviation.
x ± σ

However, if the values are averaged and the mean is reported, then the averaged measurement has uncertainty equal to the standard error, which is the standard deviation divided by the square root of the number of measurements.  (Recall that the standard deviation of the distribution of all possible sample means is calculated that way.)
x̄ ± σ/√n
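As a sketch of both quantities (the repeated readings below are invented, not from a real instrument):

```python
import math
import statistics

# Hypothetical repeated readings of the same object, in grams.
readings = [2.23, 2.24, 2.22, 2.23, 2.25, 2.23, 2.22, 2.24, 2.23]

n = len(readings)
mean = statistics.mean(readings)
sigma = statistics.stdev(readings)   # sample standard deviation
std_error = sigma / math.sqrt(n)     # uncertainty of the reported mean

print(f"single reading: {readings[0]:.2f} ± {sigma:.4f} g")
print(f"mean of {n}:     {mean:.4f} ± {std_error:.4f} g")
```

The standard error shrinks as 1/√n, which is why averaging many readings gives a tighter uncertainty than any single reading.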

Statistics:  An error is a difference between a computed, estimated, or measured value and the true, specified, or theoretically correct value.

Experimental science:  An error is a bound on the precision and accuracy of the result of a measurement.  These can be classified into two types:  statistical error (see above) and systematic error.  Statistical error is caused by random (and therefore inherently unpredictable) fluctuations in the measurement apparatus, whereas systematic error is caused by an unknown but nonrandom fluctuation.  If the cause of the systematic error can be identified, then it can usually be eliminated.  Such errors can also be referred to as uncertainties.
