Errors are most frequently modeled as belonging to a Gaussian, or "bell curve," distribution. In this approximation each quantity is fully characterized by only two parameters: the central value, or mean, and the deviation, or uncertainty. The mean indicates the most likely value, while the deviation tells how widely the actual value is likely to be spread around the mean. In a pure Gaussian distribution there is about a 32% chance that the value is more than one deviation away from the mean, a 4.5% chance that it is more than two deviations away, and a 0.3% chance that it is more than three deviations away. When the deviation is zero, the value is certain to equal the mean.
The probability density for a Gaussian distribution with mean μ and deviation σ is proportional to

exp(−(x − μ)² / 2σ²)
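These one-, two-, and three-deviation probabilities are easy to verify empirically. Here is a short Monte Carlo sketch using only Python's standard library (the sample count and seed are arbitrary choices):

```python
import random

# Draw many samples from a Gaussian with mean 0 and deviation 1, then
# count how often a sample lands more than k deviations from the mean.
random.seed(42)
samples = [random.gauss(0.0, 1.0) for _ in range(200_000)]

def tail_fraction(samples, k):
    """Fraction of samples lying more than k deviations from the mean."""
    return sum(1 for x in samples if abs(x) > k) / len(samples)

for k, theory in [(1, 0.317), (2, 0.045), (3, 0.003)]:
    print(f"{k} deviation(s): {tail_fraction(samples, k):.4f} (theory ~{theory})")
```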
One reason for using the Gaussian model is how well it matches many real distributions. In fact, the Central Limit Theorem guarantees that for any distribution with a finite mean and deviation, the sum of N variables drawn from that distribution looks more and more like a Gaussian distribution as N gets larger.
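The theorem can be seen in action by summing uniform variables, which individually look nothing like a Gaussian. A quick sketch (N = 12 is an arbitrary choice that happens to make the expected deviation exactly 1):

```python
import random
import statistics

# Sum N uniform(0, 1) variables. Each has mean 1/2 and deviation
# sqrt(1/12), so the sum should approach a Gaussian with mean N/2
# and deviation sqrt(N/12).
random.seed(0)
N = 12
sums = [sum(random.random() for _ in range(N)) for _ in range(100_000)]

print(statistics.mean(sums))   # close to N/2 = 6.0
print(statistics.stdev(sums))  # close to sqrt(N/12) = 1.0
```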
The other reason for the popularity of the Gaussian model is its computational simplicity. The sum of two variables with Gaussian distributions has a Gaussian distribution. The distribution is smooth and differentiable. It even Fourier transforms into another Gaussian distribution.
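The closure under addition is also easy to demonstrate: the means add, and the deviations add in quadrature. A short sketch with arbitrary example values:

```python
import random
import statistics

# Add two independent Gaussian variables; the result is Gaussian with
# mean m1 + m2 and deviation sqrt(d1**2 + d2**2).
random.seed(1)
a = [random.gauss(2.0, 0.3) for _ in range(100_000)]
b = [random.gauss(5.0, 0.4) for _ in range(100_000)]
total = [x + y for x, y in zip(a, b)]

print(statistics.mean(total))   # close to 2.0 + 5.0 = 7.0
print(statistics.stdev(total))  # close to sqrt(0.3**2 + 0.4**2) = 0.5
```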
Gaussian Drawbacks
The Gaussian model is not always good enough. Many real-world distributions are not Gaussian. For example, time read from a perfectly accurate system clock has uniformly distributed error between two consecutive ticks. (If the resolution is seconds, then when the clock reads 12:00:00 the true time is equally likely to be 12:00:00.01, 12:00:00.50, or 12:00:00.99, but absolutely cannot be 11:59:59.99 or 12:00:01.00.)
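This truncation error is easy to simulate: a clock with one-second resolution reports the floor of the true time, so the discarded fraction of a second is uniform on [0, 1) and strictly bounded. A minimal sketch:

```python
import math
import random

# A clock with one-second resolution reports floor(true_time), so the
# error (true time minus reading) is uniformly distributed on [0, 1).
random.seed(2)
true_times = [random.uniform(0.0, 1000.0) for _ in range(100_000)]
errors = [t - math.floor(t) for t in true_times]

print(min(errors), max(errors))  # bounded: never below 0, never reaches 1
```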
One particularly limiting problem with the Gaussian model is that its "tails" (the edges of the distribution) are infinite. There is a small but finite chance that a variable's value is, say, 100 deviations away from the mean. Infinite tails cannot model cases in which the distribution must have a limit. I may say that a block is 1.0 +/- 0.1 inches wide, but the Gaussian interpretation of this statement allows the block to have a negative width.
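That negative-width probability can even be computed: zero width lies ten deviations below the mean, and the standard library's erfc gives the Gaussian tail area beyond that point, which is tiny but strictly greater than zero:

```python
import math

# Under a Gaussian reading of "1.0 +/- 0.1 inches", a negative width is
# the lower tail more than mean/dev = 10 deviations below the mean.
mean, dev = 1.0, 0.1
k = mean / dev
p_negative = 0.5 * math.erfc(k / math.sqrt(2))

print(p_negative)  # absurdly small, but not zero
```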
Expanding the Gaussian model
The mean is sometimes referred to as the first moment of a distribution. It is calculated from a set of data simply by averaging the values of the data. The deviation comes from the second moment and can be calculated using the mean and the average of the squares of the values. Distributions that are almost Gaussian can be described more fully using a few more moments (the third moment is called skew and generally reflects the asymmetry of the distribution). But higher moments are increasingly difficult to compute accurately, and should be avoided or used with care. Without an infinite number of moments, however, it is impossible to describe some important practical cases, such as tails with limits.
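The first three moments described above can be computed directly from simple averages. A minimal sketch (using the population convention, dividing by n):

```python
import math

def moments(data):
    """Mean, deviation, and skew computed from simple averages."""
    n = len(data)
    mean = sum(data) / n
    # The deviation follows from the mean and the average of the squares.
    variance = sum(x * x for x in data) / n - mean * mean
    dev = math.sqrt(variance)
    # Skew: the third moment about the mean, normalized by the deviation.
    skew = sum((x - mean) ** 3 for x in data) / (n * dev ** 3)
    return mean, dev, skew

print(moments([1.0, 2.0, 3.0, 4.0]))  # symmetric data, so skew is 0.0
```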
Loss of "Gaussianness"
I stated earlier that the sum of Gaussian distributions is also Gaussian. Unfortunately, most operations can produce distributions that are not Gaussian even when the operands are. In the Gaussian approximation, application of a function to an initial Gaussian variable is approximated by transformation along a tangent to the true function. This approximation fails when the function is discontinuous or curved.
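The effect of curvature is easy to see with f(x) = x². A tangent-line approximation at the mean predicts an output mean of μ², but the curvature shifts the true mean to μ² + σ², and the output distribution picks up a skew. A Monte Carlo sketch with arbitrary parameters:

```python
import random
import statistics

# Push a Gaussian variable through the curved function f(x) = x**2.
# A tangent-line approximation at mu predicts mean mu**2 = 1.0, but
# curvature shifts the true mean to mu**2 + sigma**2 = 1.25.
random.seed(3)
mu, sigma = 1.0, 0.5
xs = [random.gauss(mu, sigma) for _ in range(200_000)]
ys = [x * x for x in xs]

print(statistics.mean(ys))  # near 1.25, not the tangent-line value 1.0
```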