Monday, August 17, 2009

1.4 Understanding Measurements

1. Sensitivity of an instrument is a measure of how small a change in the measured quantity the instrument can detect. Precision of a measurement is a measure of the consistency of the readings obtained when the quantity is measured repeatedly. Accuracy of a measurement is a measure of the closeness of the measured value to the actual value of the quantity measured.
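The difference between precision and accuracy can be seen from two hypothetical sets of readings of the same length. The short Python sketch below (all numbers are invented for illustration) treats a small spread as high precision and a small offset from the actual value as high accuracy.

    from statistics import mean, stdev

    actual_value = 5.00                     # cm, assumed actual length
    set_a = [4.71, 4.72, 4.70, 4.71, 4.72]  # cm: consistent but far from 5.00 (precise, not accurate)
    set_b = [4.95, 5.06, 4.98, 5.03, 4.99]  # cm: scattered but centred on 5.00 (accurate, less precise)

    for name, readings in (("A", set_a), ("B", set_b)):
        spread = stdev(readings)                     # small spread -> high precision
        offset = abs(mean(readings) - actual_value)  # small offset -> high accuracy
        print(f"Set {name}: mean = {mean(readings):.2f} cm, "
              f"spread = {spread:.2f} cm, offset = {offset:.2f} cm")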

2. All measurements have a certain amount of uncertainty. The amount of uncertainty is also known as the error in the measurement. Error does not mean mistake. It just refers to the uncertainty in a measurement.

3. Errors can be caused by

(i) the limitation due to the smallest division in the scale of the measuring instrument.


For example, a length measured with a ruler is recorded with an uncertainty (error) of ±0.1 cm because the smallest division of the ruler's scale is 0.1 cm.
(ii) random errors

Random errors are caused by

• the inability of the experimenter to be consistent when repeating the measurement (this is known as personal error)
• non-uniformity in the measured quantity (for example, the diameter of a wire may be non-uniform along its length)
• disturbances due to unstable external conditions (for example, wind may make it difficult to record the measurement)

To minimize random errors, a measurement is repeated many times and an average value is taken, as in the sketch below.
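A minimal sketch of this averaging, using hypothetical repeated readings of the same length taken with a ruler (smallest division 0.1 cm):

    from statistics import mean

    # Hypothetical repeated readings of the same length (cm);
    # the scatter represents random error.
    readings = [4.7, 4.9, 4.8, 4.8, 4.6, 4.9]

    average = mean(readings)
    # Report the average together with the uncertainty of the instrument
    # (a ruler whose smallest division is 0.1 cm).
    print(f"length = {average:.1f} ± 0.1 cm")   # length = 4.8 ± 0.1 cm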

4. Systematic errors are caused by

• defects in the instruments
(For example, when the magnitude of the measured quantity is zero, the pointer of the instrument may not coincide exactly with the zero mark on the scale. This is known as a zero error.)
• defects in the procedure of the experiment

Systematic errors are handled by making a correction to the measured quantity if the actual difference due to the systematic error is known.

For example, if the reading on a ruler is 4.8 cm and the zero error is −0.2 cm, then the reading is corrected by subtracting the zero error from the measured value:

4.8 − (−0.2) = 5.0 cm

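As a small sketch of this correction rule (using the numbers from the example above):

    def corrected_reading(measured, zero_error):
        # Correct a reading by subtracting the instrument's zero error.
        return measured - zero_error

    # Ruler reading of 4.8 cm with a zero error of -0.2 cm:
    print(f"{corrected_reading(4.8, -0.2):.1f} cm")   # 5.0 cm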

5. Random errors lead to a decrease in the precision of the measurement. Systematic errors lead to a decrease in the accuracy of the measurement.

6. The vernier caliper is a length-measuring instrument that is more sensitive than the ruler. The smallest change that can be detected with a ruler is 0.1 cm, while the smallest change that can be detected with a vernier caliper is 0.01 cm (0.1 mm).

7. The micrometer screw gauge is more sensitive than the vernier caliper. The smallest change that can be detected with a micrometer is 0.001 cm (0.01 mm).
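As a sketch of how readings on these two instruments are composed from their scales (assuming the common case of a vernier scale worth 0.01 cm per division and a micrometer thimble worth 0.01 mm per division; the example readings are invented):

    def vernier_reading_cm(main_scale_cm, coinciding_division):
        # Main scale reading plus 0.01 cm for each vernier division.
        return main_scale_cm + coinciding_division * 0.01

    def micrometer_reading_mm(main_scale_mm, thimble_division):
        # Main scale reading plus 0.01 mm for each thimble division.
        return main_scale_mm + thimble_division * 0.01

    # Hypothetical readings:
    print(f"{vernier_reading_cm(2.3, 4):.2f} cm")      # 2.34 cm
    print(f"{micrometer_reading_mm(5.5, 28):.2f} mm")  # 5.78 mm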





1 comment:

  1. A caliper is a device used to measure the distance between two opposing sides of an object. The tips of the caliper are adjusted to fit across the points to be measured; the caliper is then removed and the distance between the tips is read with a measuring tool, such as a ruler.
