
Variance

This article is about mathematics. See also variance (land use).


In mathematics, the variance of a real-valued random variable is its second central moment, and also its second cumulant (cumulants differ from central moments only at and above degree 4). If μ = E(X) is the expected value of the random variable X, then the variance is

Var(X) = E((X-μ)²),

i.e., it is the expected value of the square of the deviation of X from its own mean; in short, it is the mean squared deviation.
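
As a concrete illustration, here is a minimal sketch in plain Python (the helper name "variance" is ours, not standard) that computes the variance of a finite set of equally weighted values directly from this definition:

    def variance(xs):
        """Mean squared deviation of xs from its own mean."""
        mu = sum(xs) / len(xs)                            # the mean, E(X)
        return sum((x - mu) ** 2 for x in xs) / len(xs)   # E((X - mu)^2)

    print(variance([2, 4, 4, 4, 5, 5, 7, 9]))  # 4.0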

We can conclude two things. First, the variance is never negative, because the squares are either positive or zero. Second, the unit of variance is the square of the unit of observation; for example, the variance of a set of heights measured in centimeters will be given in square centimeters.

One reason for the use of the variance in preference to other measures of dispersion is that the variance of the sum of independent random variables is the sum of their variances. (A weaker condition than independence, called "uncorrelatedness", also suffices.)
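
A quick numerical check of this additivity (a sketch assuming numpy is available; the distributions and sample size are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 2.0, size=1_000_000)   # Var(X) = 4
    y = rng.exponential(3.0, size=1_000_000)   # Var(Y) = 9, drawn independently of X
    print(np.var(x + y))                       # close to 13
    print(np.var(x) + np.var(y))               # also close to 13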

For a set of data x1, x2, ..., xn, the variance σ² is

σ² = ((x1-μ)² + (x2-μ)² + ... + (xn-μ)²)/n

where μ = (x1 + x2 + ... + xn)/n is the mean of the values.
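
In code, this is the computation that numpy's np.var performs by default, i.e. with divisor n (a sketch, assuming numpy; the data values are arbitrary):

    import numpy as np

    data = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])
    mu = data.mean()
    print(((data - mu) ** 2).sum() / len(data))  # 4.0, the formula above
    print(np.var(data))                          # 4.0, numpy's default agrees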

If X is a vector-valued random variable, with values in R^n, and thought of as a column vector, then the natural generalization of variance is E((X-μ)(X-μ)'), where μ=E(X) and X' is the transpose of X, and so is a row vector. This variance is a nonnegative-definite square matrix, commonly referred to as the covariance matrix.
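
A sketch of the same construction on simulated data (assuming numpy; the 2-by-2 covariance used to generate the samples is an arbitrary choice):

    import numpy as np

    rng = np.random.default_rng(0)
    true_cov = [[2.0, 0.6], [0.6, 1.0]]
    X = rng.multivariate_normal([0.0, 0.0], true_cov, size=100_000)  # each row is one draw of the vector X
    mu = X.mean(axis=0)
    C = (X - mu).T @ (X - mu) / len(X)   # average of the outer products (X - mu)(X - mu)'
    print(C)                             # close to true_cov, and nonnegative-definite
    print(np.cov(X.T, bias=True))        # numpy's estimate with the same 1/n convention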

If X is a complex-valued random variable, then its variance is E((X-μ)(X-μ)*), where X* is the complex conjugate of X. This variance is a nonnegative real number.
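
The same check for a complex-valued variable (a sketch assuming numpy): the product (X-μ)(X-μ)* equals |X-μ|², so its expected value is real and nonnegative.

    import numpy as np

    rng = np.random.default_rng(0)
    z = rng.normal(size=100_000) + 1j * rng.normal(size=100_000)  # real and imaginary parts each have variance 1
    mu = z.mean()
    print(((z - mu) * np.conj(z - mu)).mean())  # ~2+0j: real part near 2, imaginary part 0
    print(np.var(z))                            # numpy returns the same value as a real number, ~2.0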

When the set of data is a population, we call this the population variance. If the set is a sample, we call it the sample variance. When estimating the population variance from a finite sample, the following formula gives an unbiased estimate:

s² = ((x1-x̄)² + (x2-x̄)² + ... + (xn-x̄)²)/(n-1)

where x̄ is the sample mean.
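
A sketch of why the divisor is n-1 (assuming numpy): averaged over many small samples, the n-1 formula centers on the true variance, while the divisor-n formula systematically runs low.

    import numpy as np

    rng = np.random.default_rng(0)
    samples = rng.normal(0.0, 2.0, size=(100_000, 5))  # 100,000 samples of size n = 5; true variance 4
    print(np.var(samples, axis=1, ddof=1).mean())      # ~4.0: unbiased (divisor n - 1)
    print(np.var(samples, axis=1, ddof=0).mean())      # ~3.2 = (n-1)/n * 4: biased low (divisor n)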

See algorithms for calculating variance.

See also: standard deviation, arithmetic mean, skewness, kurtosis, statistical dispersion