Bayesian inference

Bayesian inference (/ˈbeɪziən/ bay-ZEE-ən or /ˈbeɪʒən/ BAY-zhən)[1] is a type of statistical inference in which Bayes' theorem is used to change (or update) the probability of a hypothesis as evidence or information becomes available. Bayesian inference uses a prior distribution to estimate posterior probabilities. It is an important technique in statistics, mathematical statistics, decision theory, and sequential analysis, and it is used in science, engineering, philosophy, medicine, sport, and law.

Bayes' rule

A geometric visualisation of Bayes' theorem. In the table, the values 2, 3, 6 and 9 give the relative weights of each corresponding condition and case. The figures denote the cells of the table involved in each metric, the probability being the fraction of each figure that is shaded. This shows that P(A|B)·P(B) = P(B|A)·P(A), i.e. P(A|B) = P(B|A)·P(A)/P(B). Similar reasoning can be used to show that P(¬A|B) = P(B|¬A)·P(¬A)/P(B), etc.
Contingency table

                   Satisfies hypothesis H         Violates hypothesis ¬H           Total
Has evidence E     P(H|E)·P(E) = P(E|H)·P(H)      P(¬H|E)·P(E) = P(E|¬H)·P(¬H)     P(E)
No evidence ¬E     P(H|¬E)·P(¬E) = P(¬E|H)·P(H)   P(¬H|¬E)·P(¬E) = P(¬E|¬H)·P(¬H)  P(¬E) = 1−P(E)
Total              P(H)                           P(¬H) = 1−P(H)                   1
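The identity in the caption can be checked directly from the table. Below is a minimal sketch in Python; the assignment of the weights 2, 3, 6 and 9 to particular cells is an illustrative assumption (the identity P(A|B)·P(B) = P(B|A)·P(A) holds for any assignment, since both sides equal the joint probability of A and B).

    # Joint weights from the figure caption (2, 3, 6 and 9); their
    # placement in the cells is assumed here for illustration.
    weights = {
        ("A", "B"): 2,
        ("A", "not B"): 3,
        ("not A", "B"): 6,
        ("not A", "not B"): 9,
    }
    total = sum(weights.values())  # 20

    def p(event):
        # Marginal probability: sum the weights of all cells containing the event.
        return sum(w for cell, w in weights.items() if event in cell) / total

    def p_joint(a, b):
        # Joint probability of one cell of the table.
        return weights[(a, b)] / total

    # Conditional probabilities read off the table.
    p_A_given_B = p_joint("A", "B") / p("B")   # 0.1 / 0.4  = 0.25
    p_B_given_A = p_joint("A", "B") / p("A")   # 0.1 / 0.25 = 0.4

    # Both sides of P(A|B)·P(B) = P(B|A)·P(A) equal the joint weight 2/20.
    assert abs(p_A_given_B * p("B") - p_B_given_A * p("A")) < 1e-12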

Bayesian inference finds the posterior probability from the prior probability and the "likelihood function". The likelihood function comes from a statistical model of the data. Bayes' theorem states [math]\displaystyle{ P(H \mid E) = \frac{P(E \mid H) \cdot P(H)}{P(E)}, }[/math] where

  • [math]\displaystyle{ H }[/math] is a hypothesis whose probability is changed (updated) by the data (or evidence). There are usually several competing hypotheses, and the point of the inference is to see which hypothesis is most likely.
  • [math]\displaystyle{ P(H) }[/math] is the prior probability. It estimates the probability of a hypothesis before there is any evidence.
  • [math]\displaystyle{ E }[/math] is the evidence: the new data that has been observed.
  • [math]\displaystyle{ P(H \mid E) }[/math] is the posterior probability. This is what we want to know.
  • [math]\displaystyle{ P(E \mid H) }[/math] is the likelihood function: the probability of observing the evidence [math]\displaystyle{ E }[/math] if the hypothesis [math]\displaystyle{ H }[/math] is true.
  • [math]\displaystyle{ P(E) }[/math] is the marginal likelihood. It is the same for every hypothesis being tested, and it must be greater than 0: if [math]\displaystyle{ P(E) }[/math] were 0, the formula would divide by zero. A short numerical sketch of the whole update is given after this list.
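As a minimal sketch of this update, the Python snippet below applies the formula to two competing hypotheses. The prior and likelihood values (0.5/0.5 and 0.8/0.3) are hypothetical numbers chosen only for illustration, not taken from the article.

    # Hypothetical priors P(H), P(¬H) and likelihoods P(E|H), P(E|¬H).
    priors = {"H": 0.5, "not H": 0.5}
    likelihoods = {"H": 0.8, "not H": 0.3}

    # Marginal likelihood P(E): the same normalizing constant for every hypothesis.
    p_e = sum(likelihoods[h] * priors[h] for h in priors)  # 0.55

    # Posterior P(H|E) = P(E|H)·P(H) / P(E) for each hypothesis.
    posteriors = {h: likelihoods[h] * priors[h] / p_e for h in priors}
    print(posteriors)  # {'H': 0.727..., 'not H': 0.272...}

After seeing the evidence, the probability of H rises from the assumed prior of 0.5 to about 0.73, because the evidence is more likely under H than under ¬H.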

Further reading

  • Vallverdu, Jordi (2016). Bayesians Versus Frequentists: A Philosophical Debate on Statistical Reasoning. New York: Springer. ISBN 978-3-662-48638-2.
  • Clayton, Aubrey (August 2021). Bernoulli's Fallacy: Statistical Illogic and the Crisis of Modern Science. Columbia University Press. ISBN 978-0-231-55335-3.
