Expected value

The expected value E[X] of a random variable X is the weighted average of the values x_i that the variable can take, each weighted by its probability of occurring: $$ E[X] = \sum_i x_i \cdot p(x_i) $$
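To make the formula concrete, here is a minimal Python sketch that computes this weighted sum; the function name expected_value and the sample outcomes and probabilities are illustrative placeholders, not values taken from the text.

    # Compute E[X] as the probability-weighted sum of the outcomes.
    def expected_value(values, probabilities):
        """Return sum_i x_i * p(x_i)."""
        return sum(x * p for x, p in zip(values, probabilities))

    # Illustrative random variable: X takes 0, 1, 2 with probabilities 0.5, 0.3, 0.2.
    print(expected_value([0, 1, 2], [0.5, 0.3, 0.2]))  # 0*0.5 + 1*0.3 + 2*0.2 = 0.7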

The expected value might or might not be one of the possible outcomes of the random variable.

As a weighted average, the expected value is the number that the average of the observed results tends toward after many repetitions of the experiment.

    A practical example

    Rolling a die leads to six possible outcomes.

    $$ X = \{ 1, 2, 3, 4, 5, 6 \} $$

    Each outcome has the same probability.

    $$ p(1)=p(2)=p(3)=p(4)=p(5)=p(6)= \frac{1}{6} $$

    The expected value of the random variable is E[X] = 3.5.

    $$ E[X] = 1 \cdot \frac{1}{6} + 2 \cdot \frac{1}{6} + 3 \cdot \frac{1}{6} + 4 \cdot \frac{1}{6} + 5 \cdot \frac{1}{6} + 6 \cdot \frac{1}{6} = 3.5 $$

    Note: In this case, the expected value E[X] = 3.5 is not one of the possible outcomes of X = {1, 2, 3, 4, 5, 6}. It's the limiting value that the average of the results approaches after many rolls of the die.
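    To see this convergence concretely, here is a minimal simulation sketch (the sample sizes and the random seed are arbitrary choices): the average of the simulated rolls drifts toward 3.5 as the number of rolls grows.

        # Simulate die rolls and watch the running average approach E[X] = 3.5.
        import random

        random.seed(0)  # arbitrary seed, only for reproducibility

        for n in (10, 1_000, 100_000):
            rolls = [random.randint(1, 6) for _ in range(n)]  # fair six-sided die
            print(n, sum(rolls) / n)  # the average gets closer to 3.5 as n grows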

