Sigma is out, standard deviation is the way to go!
The distinction between sigma (σ) and ‘s’ as symbols for the standard deviation of a normal distribution is simple: σ denotes the idealised population standard deviation, derived from an infinite number of measurements, whereas ‘s’ denotes the sample standard deviation, derived from a finite number of measurements.
Examples of good and bad practice
Bad practice
The external reproducibility (2 SD) obtained ...
The 2σ error ...
Uncertainties shown are at the 1 standard deviation level (i.e., 68.3 % confidence) ...
Good practice
The intermediate precision expressed as 2s obtained ...
The 2s precision ...
Uncertainties shown are at the 1s level (i.e., 68.3 % confidence) ...
Commentary
Common statistical practice defines an ideal normal distribution as comprising an infinite number of measurements, characterised by a population mean (µ) and a dispersion defined by a population standard deviation (σ). Under these ideal conditions, 68.27% of the data distribution lies within the limits (µ ± σ), 95.45% within (µ ± 2σ) and 99.73% within (µ ± 3σ).
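These coverage fractions follow directly from the normal error function: the probability that a value lies within (µ ± kσ) is erf(k/√2). A minimal Python sketch (standard library only, not part of the original text) reproduces the figures quoted above:

```python
from math import erf, sqrt

def normal_coverage(k: float) -> float:
    """Fraction of an ideal normal distribution lying within (mu ± k*sigma)."""
    return erf(k / sqrt(2))

for k in (1, 2, 3):
    # Prints 68.27 %, 95.45 % and 99.73 % for k = 1, 2, 3
    print(f"mu ± {k}σ: {100 * normal_coverage(k):.2f} %")
```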
Of course, in the real world, data distributions comprise a finite number of measurements. To mark this distinction, the sample mean (from a finite number of measurements) is distinguished from the population mean (from an infinite number of measurements) by the symbol ‘x̅’ in place of ‘µ’, and the sample standard deviation from the population standard deviation by the symbol ‘s’ in place of ‘σ’.
Thus, the sample mean (x̅) is an estimate of the population mean (µ), and the sample standard deviation (s) is an estimate of the population standard deviation (σ).
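As an illustration with hypothetical replicate data (the values below are invented for this example), Python's statistics module computes x̅ and s using the finite-sample (n − 1) definition, so s is an estimate of σ rather than σ itself:

```python
import statistics

# Hypothetical replicate measurements (illustrative values only)
measurements = [10.2, 9.8, 10.1, 9.9, 10.0]

x_bar = statistics.mean(measurements)   # sample mean x̅, an estimate of µ
s = statistics.stdev(measurements)      # sample standard deviation s (n - 1 denominator), an estimate of σ

print(f"x̅ = {x_bar:.3f}, s = {s:.3f}, 2s = {2 * s:.3f}")
```

Note that `statistics.stdev` gives the sample standard deviation ‘s’, while `statistics.pstdev` would give the population form — the very distinction the text draws.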
The symbol ‘σ’ is therefore reserved for ideal normal distributions comprising an infinite number of measurements.
Index to terms

| Concept | Metrological terms covered |
| --- | --- |
| The ambiguities associated with the use of ‘ppm’ | |
| What symbols are used to represent the properties of population and sample distributions? | |
| Avoiding the use of the term ‘standards’ when referring to (certified) reference materials or calibrators | Reference material, certified reference material, standard reference material, calibrator, calibration, validation, measurement standard (étalon), verification |
| Explaining the difference between uncertainty and error | Measurement uncertainty, measurement error, systematic measurement error, measurement bias, random measurement error, confidence level |
| Distinguishing between repeatability, intermediate precision and reproducibility and discouraging the use of ‘internal precision’ | Measurement precision, repeatability condition of measurement, intermediate precision of measurement, reproducibility condition of measurement, repeatability, intermediate precision, reproducibility |
| Explaining the difference between accuracy, bias and trueness | Measurement accuracy, measurement trueness, measurement bias |
> Download our Glossary Leaflet here. 