Markov's inequality (and other similar inequalities) relates probabilities to expectations, and provides bounds for the distribution function of a random variable that are frequently loose but still useful.
Definition

Markov's inequality states that if X is a random variable and a is some positive constant (a > 0), then

P(|X| ≥ a) ≤ E(|X|)/a.
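As a quick numerical illustration, the following minimal sketch (assuming NumPy is available; the Exponential(1) distribution and the cutoff a = 3 are chosen purely for illustration) estimates both sides of the inequality by simulation.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Exponential(1) samples, chosen purely for illustration; X >= 0 here, so E(|X|) = E(X) = 1.
x = rng.exponential(scale=1.0, size=1_000_000)

a = 3.0
tail = np.mean(np.abs(x) >= a)   # Monte Carlo estimate of P(|X| >= a); exact value is exp(-3) ~ 0.0498
bound = np.mean(np.abs(x)) / a   # Monte Carlo estimate of the Markov bound E(|X|)/a ~ 1/3

print(f"P(|X| >= {a}) is about {tail:.4f}; Markov bound E(|X|)/{a} is about {bound:.4f}")
```

The estimated tail probability comes out well below the bound, which illustrates how loose the inequality can be while still being valid.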
A Generalisation

Markov's inequality is actually just one of a wider class of inequalities relating probabilities and expectations, all of which are instances of a single theorem.
Theorem

Let X be a random variable and a be some positive constant (a > 0). If h is a non-negative function, then

P(h(X) ≥ a) ≤ E(h(X))/a.
Proof

Let A be the set {x: h(x) ≥ a}, and let I_A(x) be the indicator function of A. (That is, I_A(x) = 1 if x ∈ A, and is 0 otherwise.) Then, for every x,

a I_A(x) ≤ h(x).

(If x ∈ A then h(x) ≥ a, while if x ∉ A the left-hand side is 0 and h(x) ≥ 0.)
The theorem follows by taking the expectation of both sides of this inequality, and observing that E(I_A(X)) = P(X ∈ A) = P(h(X) ≥ a). This gives a P(h(X) ≥ a) ≤ E(h(X)), and dividing by a completes the proof.
Examples
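Choosing h(x) = |x| recovers Markov's inequality as stated above. Choosing h(x) = (x − μ)², where μ = E(X), and a = k² gives Chebyshev's inequality: P(|X − μ| ≥ k) ≤ Var(X)/k² for any k > 0. The following minimal sketch (assuming NumPy is available; the standard normal distribution and the particular values of k are chosen purely for illustration) checks the Chebyshev form by simulation.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Standard normal samples, chosen purely for illustration; mu = 0 and Var(X) = 1.
x = rng.normal(loc=0.0, scale=1.0, size=1_000_000)

mu, var = x.mean(), x.var()
for k in (1.5, 2.0, 3.0):
    tail = np.mean(np.abs(x - mu) >= k)   # Monte Carlo estimate of P(|X - mu| >= k)
    bound = var / k**2                    # Chebyshev bound Var(X)/k^2 from the general theorem
    print(f"k = {k}: tail probability ~ {tail:.4f}, Chebyshev bound = {bound:.4f}")
```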