
Dominated convergence theorem

In mathematics, Henri Lebesgue's dominated convergence theorem states that if a sequence { f_n : n = 1, 2, 3, ... } of real-valued measurable functions on a measure space S converges almost everywhere, and is "dominated" (explained below) by some measurable function g whose integral is finite, then

$$\lim_{n\to\infty}\int_S f_n\,d\mu = \int_S \lim_{n\to\infty} f_n\,d\mu.$$

To say that the sequence is "dominated" by g means that

$$|f_n(x)| \le g(x)$$

for every n and "almost every" x (i.e., the measure of the set of exceptional values of x is zero). The theorem assumes that g is "integrable", i.e.,

$$\int_S |g|\,d\mu < \infty.$$

Given the inequalities above, the absolute value sign enclosing g may be dispensed with. In Lebesgue's theory of integration, one calls a function "integrable" precisely if the integral of its absolute value is finite, so this hypothesis is expressed by saying that the dominating function is integrable.
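
The statement can also be illustrated numerically. The sketch below assumes NumPy and uses a hypothetical example chosen only for illustration, f_n(x) = x^n on [0, 1] dominated by the constant function g(x) = 1; integrals are approximated by averaging function values over a uniform grid, so this is a sanity check of the conclusion rather than part of the theorem.

```python
import numpy as np

# Illustrative example (chosen here, not from the article): f_n(x) = x^n on [0, 1].
# The sequence converges to 0 almost everywhere and is dominated by g(x) = 1,
# which is integrable on [0, 1], so the dominated convergence theorem applies.
x = np.linspace(0.0, 1.0, 1_000_001)

for n in (1, 10, 100, 1000):
    f_n = x ** n
    # On a uniform grid over [0, 1], the mean value approximates the integral of f_n.
    print(f"n = {n:4d}   integral of f_n ~= {f_n.mean():.6f}")

# The printed values tend to 0, which is the integral of the pointwise limit
# (the zero function), as the theorem predicts.
```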

That the assumption that the integral of g is finite cannot be dispensed with may be seen as follows: on the interval (0, 1) with Lebesgue measure, let f_n(x) = n if 0 < x < 1/n and f_n(x) = 0 otherwise. In that case,

$$\int_0^1 f_n\,d\mu = n\cdot\frac{1}{n} = 1 \quad\text{for every } n, \qquad\text{but}\qquad \int_0^1 \lim_{n\to\infty} f_n\,d\mu = \int_0^1 0\,d\mu = 0,$$

so the conclusion of the theorem fails. No integrable g can dominate this sequence: the smallest candidate, the pointwise supremum sup_n f_n(x), is at least 1/x − 1 on (0, 1), and its integral is infinite.
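
A quick numerical check of this counterexample, again assuming NumPy and approximating the integral over (0, 1) by a grid average, shows the integrals of the f_n staying at 1 while the pointwise limit integrates to 0; as above, this is only an illustrative sketch.

```python
import numpy as np

# Numerical check of the counterexample: f_n(x) = n on (0, 1/n), 0 elsewhere.
x = np.linspace(0.0, 1.0, 1_000_001)

for n in (1, 10, 100, 1000):
    f_n = np.where((x > 0.0) & (x < 1.0 / n), float(n), 0.0)
    # Grid average over [0, 1] approximates the integral of f_n.
    print(f"n = {n:4d}   integral of f_n ~= {f_n.mean():.4f}")  # stays close to 1

# Every f_n has integral 1, but f_n(x) -> 0 at every point, and the zero function
# integrates to 0; the theorem's conclusion fails because no integrable g
# dominates the sequence.
```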