Expected value

Rolling a die, the sample mean converges to the expected value of 3.5

In probability theory and statistics, the expected value of a random variable [math]\displaystyle{ X }[/math] (written [math]\displaystyle{ E(X) }[/math][1]) is the long-run average value of the variable: if the experiment is repeated many times, the mean (or weighted average) of all the observed values approaches [math]\displaystyle{ E(X) }[/math].[2]

By definition, the expected value of a discrete random variable [math]\displaystyle{ X }[/math] is calculated by the formula [math]\displaystyle{ \sum_x x P(X=x) }[/math], where [math]\displaystyle{ P(X=x) }[/math] is the probability that [math]\displaystyle{ X }[/math] equals [math]\displaystyle{ x }[/math], and [math]\displaystyle{ x }[/math] ranges over all possible values of [math]\displaystyle{ X }[/math].[3]
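As a sketch of this definition, the expected value of a fair six-sided die can be computed directly from the formula (the die example here is an illustration, not part of the cited sources):

```python
from fractions import Fraction

# Probability distribution of a fair six-sided die: each face has probability 1/6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

# E(X) = sum over x of x * P(X = x)
expected_value = sum(x * p for x, p in pmf.items())

print(expected_value)  # 7/2, i.e. 3.5
```

Using `Fraction` keeps the arithmetic exact, so the result is the familiar 3.5 with no floating-point rounding.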

The law of large numbers describes how this convergence happens.
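The convergence can be seen in a small simulation (a minimal sketch; the number of rolls and the random seed are arbitrary choices, not from the sources):

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Simulate many rolls of a fair die; by the law of large numbers the
# sample mean approaches the expected value 3.5 as the sample grows.
rolls = [random.randint(1, 6) for _ in range(100_000)]
mean = sum(rolls) / len(rolls)

print(mean)  # close to 3.5
```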

Relationship to weighted average

The expected value can be thought of as a weighted average in which the weights [math]\displaystyle{ w_j }[/math] are equal to [math]\displaystyle{ P(X=x_j) }[/math]. Since the probabilities sum to one, [math]\displaystyle{ \sum_j P(X=x_j) = 1 }[/math], the weighted average can be written as:

[math]\displaystyle{ \frac{\sum_j w_j x_j}{\sum_j w_j}=\sum_x x P(X=x)=E(X) }[/math]
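The identity above can be checked numerically. The following sketch uses a hypothetical loaded die (face 6 with probability 1/2, the others 1/10 each) chosen only for illustration:

```python
from fractions import Fraction

# A loaded die: face 6 has probability 1/2, each other face 1/10.
pmf = {1: Fraction(1, 10), 2: Fraction(1, 10), 3: Fraction(1, 10),
       4: Fraction(1, 10), 5: Fraction(1, 10), 6: Fraction(1, 2)}

values = list(pmf.keys())
weights = list(pmf.values())  # w_j = P(X = x_j)

# Weighted average: sum(w_j * x_j) / sum(w_j); the weights are
# probabilities, so sum(w_j) == 1 and this equals E(X).
weighted_avg = sum(w * x for w, x in zip(weights, values)) / sum(weights)
ev = sum(x * p for x, p in pmf.items())

print(weighted_avg == ev)  # True
```

Because the weights already sum to one, dividing by [math]\displaystyle{ \sum_j w_j }[/math] changes nothing, which is exactly why the weighted-average form collapses to the expected-value formula.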

References

  1. "List of Probability and Statistics Symbols". Math Vault. 2020-04-26. Retrieved 2020-08-21.
  2. "Expected Value - easily explained! | Data Basecamp". 2021-11-26. Retrieved 2022-07-15.
  3. "Expected Value | Brilliant Math & Science Wiki". brilliant.org. Retrieved 2020-08-21.