Commonly used terms in Instrument Specifications

Accuracy – It is the degree of agreement between the instrument reading and the true concentration. Accuracy may be expressed as a percentage of the full scale reading, a percentage of the actual reading or an absolute value in the units of measurement.
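
To make the two percentage conventions concrete, here is a minimal Python sketch with an assumed 0–100 ppm instrument and a hypothetical ±2% specification. A full-scale spec allows the same absolute band at every reading, while an of-reading spec tightens at low concentrations.

```python
# Illustrative sketch: what a +/-2% accuracy spec allows under the two
# percentage conventions. The 0-100 ppm range and the 2% figure are
# assumptions for the example.
FULL_SCALE_PPM = 100.0
SPEC_PCT = 2.0

for reading in (10.0, 50.0, 90.0):
    band_fs = FULL_SCALE_PPM * SPEC_PCT / 100.0  # of full scale: constant band
    band_rd = reading * SPEC_PCT / 100.0         # of reading: scales with reading
    print(f"{reading:5.1f} ppm  +/-{band_fs:.2f} ppm (full scale)"
          f"  +/-{band_rd:.2f} ppm (of reading)")
```
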
Precision – It characterizes the degree of mutual agreement among a series of individual measurements, values or results.
Accuracy vs. precision – An archery target makes a good analogy for explaining the difference. Accuracy describes the closeness of the arrows to the centre of the target. Precision describes how close the arrows are to each other. You can have a high level of precision even if all of the arrows are stuck in the outer ring, as long as they are grouped closely together.
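
The archery analogy maps directly onto two statistics: the bias of the mean reading relative to a known reference (accuracy) and the standard deviation of repeated readings (precision). A minimal Python sketch, using invented readings against an assumed 50 ppm reference gas:

```python
import statistics

# Illustrative sketch: separating accuracy from precision in repeated
# readings against a known reference gas. The 50 ppm reference and the
# reading lists are invented for the example.
TRUE_PPM = 50.0

tight_but_off   = [54.9, 55.1, 55.0, 54.8, 55.2]  # precise, not accurate
centred_scatter = [46.0, 55.0, 49.0, 52.0, 48.0]  # accurate, not precise

for name, readings in (("precise/inaccurate", tight_but_off),
                       ("accurate/imprecise", centred_scatter)):
    bias = statistics.mean(readings) - TRUE_PPM  # accuracy: closeness to truth
    spread = statistics.stdev(readings)          # precision: mutual agreement
    print(f"{name}: bias {bias:+.2f} ppm, std dev {spread:.2f} ppm")
```
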
Resolution – It is the lowest concentration of the substance being measured that can be reliably detected by the instrument.
Increment of measurement – It is the least significant measurement unit used to display a reading.
Response time – It is the time required, after initial exposure, for the sensor to reach its final stable reading.
Recovery time – It is the time necessary for the sensor to recover after exposure to a step-change in concentration.
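
Response and recovery times are commonly quoted as T90 and T10: the time to reach 90% of the final reading after a concentration step, and the time to fall back to 10% of the peak after the step is removed. The Python sketch below assumes those conventions and a one-second logging interval, with invented readings:

```python
# Hedged sketch: estimating response and recovery times from a logged step
# test. The T90/T10 conventions, the 1 s sample interval and the readings
# are assumptions for illustration.

def seconds_to_reach(samples, threshold, interval_s=1.0):
    """Seconds until the signal first crosses `threshold` (rising)."""
    for i, value in enumerate(samples):
        if value >= threshold:
            return i * interval_s
    return None  # threshold never reached

# Readings once per second after a concentration step at t = 0.
rise = [0.0, 12.0, 31.0, 55.0, 74.0, 86.0, 93.0, 97.0, 99.0, 100.0]
print(f"response time (T90): {seconds_to_reach(rise, 0.90 * rise[-1]):.0f} s")

# Readings once per second after the step is removed (falling signal).
fall = [100.0, 71.0, 48.0, 30.0, 18.0, 11.0, 7.0, 4.0]
t10 = next(i * 1.0 for i, v in enumerate(fall) if v <= 0.10 * fall[0])
print(f"recovery time (T10): {t10:.0f} s")
```
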
Repeatability – It is the maximum percentage variation between repeated, independent readings on a sensor.
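
Specifications vary in how the variation is referenced; the Python sketch below assumes one common choice, the largest deviation of any single reading from the mean of the run, expressed as a percentage of that mean. The readings are invented:

```python
import statistics

# Illustrative sketch: repeatability as the largest deviation of any single
# reading from the mean of a run, as a percentage of that mean. Both the
# mean-referenced definition and the readings are assumptions.
readings = [49.6, 50.3, 50.1, 49.8, 50.2]

mean = statistics.mean(readings)
repeatability_pct = max(abs(r - mean) for r in readings) / mean * 100.0
print(f"repeatability: +/-{repeatability_pct:.2f}%")  # +/-0.80% here
```
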
Linearity – It is a measure of how well the concentration response curve of an instrument fits the equation for a straight line.
Linear range – It is that portion of the concentration range over which the instrument’s concentration response matches a straight line.
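
Both linearity and linear range can be checked the same way: least-squares fit the readings against known reference concentrations and examine how far each point falls from the fitted line. A minimal Python sketch (the calibration data are invented; statistics.linear_regression requires Python 3.10+):

```python
import statistics

# Hedged sketch: judging linearity from a calibration run. Fit a straight
# line by least squares, then report the worst residual. The reference
# concentrations and readings are invented.
conc = [0.0, 20.0, 40.0, 60.0, 80.0, 100.0]  # reference concentrations (ppm)
resp = [0.1, 19.8, 40.3, 59.6, 80.4, 99.8]   # instrument readings (ppm)

slope, intercept = statistics.linear_regression(conc, resp)
residuals = [r - (slope * c + intercept) for c, r in zip(conc, resp)]
worst = max(abs(e) for e in residuals)

print(f"fit: reading = {slope:.4f} * conc + {intercept:.3f}")
print(f"max deviation from the line: {worst:.2f} ppm "
      f"({worst / conc[-1] * 100.0:.2f}% of full scale)")
```
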
Noise – It is the random fluctuation in the signal that is independent of the concentration being measured.
Drift – It refers to slow or long-term changes in the instrument reading that are not caused by immediate changes in the concentration of the substance being measured.
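
Because noise is random and drift is systematic, one logged run at constant concentration can estimate both: fit a trend line over time, take its slope as the drift, and take the scatter of the residuals as the noise. A sketch under those assumptions (invented data; statistics.linear_regression requires Python 3.10+):

```python
import statistics

# Illustrative sketch: separating drift from noise in a signal logged at a
# constant concentration. Drift = slope of a least-squares trend line over
# time; noise = standard deviation of the residuals around that trend.
# The hourly data are invented.
hours  = list(range(10))
signal = [50.0, 50.3, 49.9, 50.4, 50.2, 50.6, 50.3, 50.7, 50.5, 50.8]

slope, intercept = statistics.linear_regression(hours, signal)
residuals = [s - (slope * h + intercept) for h, s in zip(hours, signal)]

print(f"drift: {slope:+.3f} ppm/hour")                  # slow, systematic change
print(f"noise: {statistics.stdev(residuals):.3f} ppm")  # random fluctuation
```
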