(A likelihood function is a conditional probability distribution considered as a function of its second argument, holding the first fixed. For example, consider a model which gives the probability of an observable $X$ as a function of a parameter $\theta$. Then for a specific value $x$ of $X$, the function $L(\theta) = P(X = x \mid \theta)$ is a likelihood function for $\theta$.) A widely-used application of the likelihood principle is the method of maximum likelihood.
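For concreteness, here is a minimal Python sketch of a likelihood function and its maximizer, assuming a binomial model and a made-up observation of 7 successes in 10 trials (neither of which appears in the text above):

```python
import numpy as np
from scipy.stats import binom

n, x = 10, 7                        # fixed observation: 7 successes in 10 trials (illustrative)
thetas = np.linspace(0.0, 1.0, 101) # grid of parameter values

# Likelihood: the same pmf P(X = x | theta), but evaluated across theta with x held fixed.
likelihood = binom.pmf(x, n, thetas)

# Maximum likelihood: pick the theta that maximizes the likelihood over the grid.
theta_hat = thetas[np.argmax(likelihood)]
print(theta_hat)                    # ~0.7, i.e. x / n for the binomial model
```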
The likelihood principle is not universally accepted. Some widely-used methods of conventional statistics, for example significance tests, are not consistent with the likelihood principle. Let us briefly consider some of the arguments for and against the likelihood principle.
Arguments in favor of the likelihood principle

From a Bayesian point of view, the likelihood principle is a consequence that falls out of Bayes' theorem. An observation $A$ enters the formula

$$P(\theta \mid A) = \frac{P(A \mid \theta)\, P(\theta)}{P(A)}$$

only through the likelihood function, $P(A \mid \theta)$. In general, observations come into play through the likelihood function, and only through the likelihood function; no other mechanism is needed.
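A short sketch of this point, assuming a flat prior on a parameter grid and two hypothetical experiments whose likelihoods are proportional (3 successes observed in 12 binomial trials, versus 9 failures observed before the 3rd success under inverse binomial sampling); neither experiment is described in the text above, but because the observation enters the posterior only through the likelihood, the two posteriors coincide:

```python
import numpy as np
from scipy.stats import binom, nbinom

thetas = np.linspace(0.001, 0.999, 999)   # grid over the parameter theta
prior = np.ones_like(thetas)              # flat prior over the grid

def posterior(likelihood):
    # Bayes' theorem on a grid: posterior is proportional to likelihood * prior.
    unnorm = likelihood * prior
    return unnorm / unnorm.sum()

lik_binomial = binom.pmf(3, 12, thetas)   # fixed n = 12 trials, observed x = 3 successes
lik_negbinom = nbinom.pmf(9, 3, thetas)   # sample until 3 successes, observed 9 failures

# Both likelihoods are proportional to theta^3 (1 - theta)^9, so the posteriors match.
print(np.allclose(posterior(lik_binomial), posterior(lik_negbinom)))   # True
```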