
2.4 Expectation, Mean, and Variance

First-Penguin 2023. 9. 18. 22:54

Expectation

Def.

We define the expected value (a.k.a. expectation or the mean)
of a random variable X, with PMF \(p_X\), by

$$E[X] = \sum_{x} xp_X(x)$$
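
As a minimal sketch of the definition (the die PMF below is an illustrative choice, not from the text), the expectation is just a probability-weighted sum over the values of \(X\):

```python
from fractions import Fraction

# Expected value of a discrete random variable from its PMF.
# Example: a fair six-sided die, p_X(x) = 1/6 for x = 1, ..., 6.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

E_X = sum(x * p for x, p in pmf.items())
print(E_X)  # 7/2, i.e. 3.5 -- the "center of gravity" of the PMF
```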


Interpretation

- center of gravity of the PMF

- average in a large number of independent repetitions of the experiment

 

Let \(X\) be a r.v. and let \(Y = g(X)\). Then

$$E[Y] = \sum_{x} g(x)p_X(x),$$

where \(p_X\) is the PMF of the original random variable \(X\).

 

Caution

In general, \(E[g(X)] \neq g(E[X])\).

If \(g\) is linear, i.e., \(g(x) = \alpha x + \beta\), then \(E[g(X)] = g(E[X])\).
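
A quick numerical check of this caution, using the same die PMF and the nonlinear choice \(g(x) = x^2\) (both are illustrative, not from the text):

```python
from fractions import Fraction

# E[g(X)] vs. g(E[X]) for a nonlinear g: here g(x) = x**2.
pmf = {x: Fraction(1, 6) for x in range(1, 7)}

E_X = sum(x * p for x, p in pmf.items())
E_g = sum(x**2 * p for x, p in pmf.items())

print(E_g)     # 91/6 (about 15.17) = E[X^2]
print(E_X**2)  # 49/4 (12.25)       = (E[X])^2 -- not equal
```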

 

Properties

  • \(E[\alpha] = \alpha\)
  • \(E[\alpha X] = \alpha E[X]\)
  • \(E[\alpha X + \beta] = \alpha E[X] + \beta\)
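
These linearity properties are easy to verify numerically; a minimal sketch with an arbitrary illustrative PMF:

```python
# Checking E[aX + b] = a*E[X] + b on a small PMF.
pmf = {0: 0.25, 1: 0.5, 4: 0.25}  # arbitrary illustrative PMF
a, b = 2.0, 1.0

E_X = sum(x * p for x, p in pmf.items())
E_aXb = sum((a * x + b) * p for x, p in pmf.items())

print(E_aXb)        # 4.0
print(a * E_X + b)  # 4.0 -- identical, as the third property states
```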

Variance, Moments, and the Expected Value Rule

We define the \(n\)th moment as \(E[X^n]\), the expected value of the random variable \(X^n\).

With this terminology, the 1st moment of \(X\) is just the mean.

 

Variance

Variance is denoted by \(var(X)\) and is defined as the expected value of the random variable \((X - E[X])^2\), i.e.,

$$var(X) = E[(X - E[X])^2].$$

 

Since \((X - E[X])^2\) can only take nonnegative values, the variance is always nonnegative.

 

How to calculate?

One way is to average \((x - E[X])^2\) directly against the PMF of \(X\), expand the square, and use the facts \(\sum_{x} x\,p_X(x) = E[X]\) and \(\sum_{x} p_X(x) = 1\):

$$\begin{align*}
var(X) &= E[(X - E[X])^2]\\
 &= \sum_{x}(x - E[X])^2 p_X(x)\\
 &= \sum_{x}\left(x^2 - 2xE[X] + (E[X])^2\right) p_X(x)\\
 &= E[X^2] - 2(E[X])^2 + (E[X])^2\\
 &= E[X^2] - (E[X])^2
\end{align*}$$
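
Both expressions give the same number, which is easy to confirm on a small example (the PMF is an illustrative choice):

```python
# var(X) computed two ways: from the definition and from moments.
pmf = {1: 0.25, 2: 0.5, 3: 0.25}  # arbitrary illustrative PMF

E_X = sum(x * p for x, p in pmf.items())
E_X2 = sum(x**2 * p for x, p in pmf.items())

var_def = sum((x - E_X) ** 2 * p for x, p in pmf.items())
var_moments = E_X2 - E_X**2

print(var_def, var_moments)  # 0.5 0.5 -- both formulas agree
```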

 

Properties:

  • \(var(X) \geq 0\)
  • \(var(\alpha X + \beta) = \alpha^2 var(X)\)
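
The second property says that shifting a random variable does not change its spread, while scaling it by \(\alpha\) scales the variance by \(\alpha^2\); a short check (same illustrative PMF as above):

```python
# Checking var(aX + b) = a**2 * var(X).
pmf = {1: 0.25, 2: 0.5, 3: 0.25}  # arbitrary illustrative PMF
a, b = 3.0, 10.0

E_X = sum(x * p for x, p in pmf.items())
var_X = sum((x - E_X) ** 2 * p for x, p in pmf.items())

E_Y = sum((a * x + b) * p for x, p in pmf.items())
var_Y = sum((a * x + b - E_Y) ** 2 * p for x, p in pmf.items())

print(var_Y)         # 4.5
print(a**2 * var_X)  # 4.5 -- the shift b drops out
```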


Standard deviation

Standard deviation is defined as the square root of the variance and is denoted by \(\sigma_X\):

$$\sigma_X = \sqrt{var(X)}$$


The standard deviation is often easier to interpret because it has the same units as X.


There is an easier method to calculate \(var(X)\), which uses the PMF of \(X\) but does not require the PMF of \((X - E[X])^2\). It is based on the following rule.

Expected Value Rule for Functions of Random Variables

Let X be a random variable with PMF \(p_X\), and let \(g(X)\) be a function of \(X\). Then, the expected value of the random variable \(g(X)\) is given by 

$$E[g(X)] = \sum_{x} g(x)p_X(x).$$
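
The point of the rule is that the PMF of \(Y = g(X)\) never has to be constructed. A sketch that computes \(E[g(X)]\) both ways (the PMF and \(g\) are illustrative choices):

```python
# Expected value rule: E[g(X)] = sum over x of g(x) * p_X(x),
# cross-checked against building p_Y(y) = sum of p_X(x) over {x | g(x) = y}.
from collections import defaultdict

pmf = {-2: 0.25, -1: 0.25, 1: 0.25, 2: 0.25}  # illustrative PMF


def g(x):
    return x**2  # g merges -x and x into one value of Y


# Direct route: the expected value rule, no PMF of Y needed.
E_g = sum(g(x) * p for x, p in pmf.items())

# Indirect route: construct the PMF of Y = g(X), then average.
pmf_Y = defaultdict(float)
for x, p in pmf.items():
    pmf_Y[g(x)] += p
E_Y = sum(y * p for y, p in pmf_Y.items())

print(E_g, E_Y)  # 2.5 2.5 -- both routes agree
```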

 


Proof.

Let \(Y = g(X)\) and use \(p_Y(y) = \sum_{\{x \,\mid\, g(x) = y\}} p_X(x)\). Then

$$E[Y] = \sum_{y} y\,p_Y(y) = \sum_{y} y \sum_{\{x \,\mid\, g(x) = y\}} p_X(x) = \sum_{y} \sum_{\{x \,\mid\, g(x) = y\}} g(x)\,p_X(x) = \sum_{x} g(x)\,p_X(x).$$

 


Variance

The variance \(var(X)\) of a random variable \(X\) is defined by

$$var(X) = E [(X - E[X])^2] ,$$

and can be calculated as

$$var(X) = \sum_{x}(x - E[X])^2p_X(x).$$

 

It is always nonnegative.

Its square root is denoted by \(\sigma_X\) and is called the standard deviation.


Variance in Terms of Moments

$$var(X) = E[X^2] - (E[X])^2$$


References

Bertsekas, D. P., & Tsitsiklis, J. N. (2008). Introduction to Probability (2nd ed.). Athena Scientific.