
2.7 Independence

First-Penguin 2024. 3. 15. 14:38

- Independence as it relates to random variables.

 - It is defined in terms of suitable events involving the possible values of the various random variables.

Independence of a Random Variable from an Event

'The occurrence of the conditioning event provides no new information on the value of the random variable.'

 

The random variable \(X\) is independent of the event \(A\) if

$$P(X = x \text{ and } A) = P(X = x)P(A) = p_X(x)P(A), \quad \text{for all } x.$$

-> this requires that the two events \(\{X = x\}\) and \(A\) be independent for every choice of \(x\).

 

From the def. of the conditional PMF, we have

$$P(X = x \text{ and } A) = p_{X|A}(x)P(A),$$

so, as long as \(P(A) > 0\), independence is the same as the condition

$$p_{X|A}(x) = p_X(x), \quad \text{for all } x.$$

 

 

key point: if \(p_X\) and \(p_{X|A}\) differ, then \(X\) and the event \(A\) are not independent (this condition is checked on a toy example below).
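
As a concrete check, here is a minimal Python sketch; the sample space, \(X\), and \(A\) are toy choices of mine, not from the text:

```python
# Toy check of independence of a random variable X from an event A:
# compare p_X(x) with p_{X|A}(x) for every x (exact arithmetic via Fraction).
from fractions import Fraction

# Sample space: (coin, die) pairs, all 12 outcomes equally likely.
omega = [(c, d) for c in "HT" for d in range(1, 7)]
p = {w: Fraction(1, len(omega)) for w in omega}

X = lambda w: w[1]                      # X = the die value
A = {w for w in omega if w[0] == "H"}   # A = "the coin shows heads"
P_A = sum(p[w] for w in A)

for x in sorted({X(w) for w in omega}):
    p_X = sum(p[w] for w in omega if X(w) == x)            # p_X(x)
    p_X_given_A = sum(p[w] for w in A if X(w) == x) / P_A  # p_{X|A}(x)
    assert p_X == p_X_given_A  # holds for every x here
print("p_{X|A} = p_X, so X is independent of A")
```

Replacing \(A\) with, say, \(\{X \le 3\}\) makes the assertion fail, since conditioning then changes the PMF of \(X\).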

 


Independence of Random Variables

We say that two random variables \(X\) and \(Y\) are independent if $$p_{X, Y}(x, y) = p_X(x)p_Y(y), \quad \text{for all } x, y.$$

Equivalently, the two events \(\{X = x\}\) and \(\{Y = y\}\) are independent for every \(x\) and \(y\).

  • By the multiplication rule, $$p_{X, Y}(x, y) = p_{X|Y}(x|y)\, p_Y(y),$$
  • so independence is equivalent to $$p_{X|Y}(x|y) = p_X(x), \quad \text{for all } y \text{ with } p_Y(y) > 0 \text{ and all } x$$ (checked numerically in the sketch below).
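
A small sketch of this factorization test, assuming a toy joint PMF (two fair coin flips coded as 0/1; an illustrative choice, not from the text):

```python
# Test p_{X,Y}(x, y) = p_X(x) p_Y(y) for every pair (x, y).
from fractions import Fraction
from itertools import product

# Joint PMF of two independent fair coin flips.
p_XY = {(x, y): Fraction(1, 4) for x, y in product([0, 1], repeat=2)}

# Marginal PMFs, obtained by summing the joint PMF over the other variable.
p_X = {x: sum(p_XY[x, y] for y in [0, 1]) for x in [0, 1]}
p_Y = {y: sum(p_XY[x, y] for x in [0, 1]) for y in [0, 1]}

independent = all(p_XY[x, y] == p_X[x] * p_Y[y] for x, y in p_XY)
print("X and Y independent:", independent)  # True for this joint PMF
```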

 

Conditional Independence

Two random variables \(X\) and \(Y\) are conditionally independent, given an event \(A\) with \(P(A) > 0\), if

  • $$P(X = x, Y = y \mid A) = P(X = x \mid A)P(Y = y \mid A), \quad \text{for all } x \text{ and } y,$$
  • or $$p_{X,Y|A}(x, y) = p_{X|A}(x)\,p_{Y|A}(y), \quad \text{for all } x \text{ and } y.$$
  • This is equivalent to $$p_{X|Y,A}(x|y) = p_{X|A}(x), \quad \text{for all } x \text{ and } y \text{ such that } p_{Y|A}(y) > 0$$ (see the sketch below).
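
The sketch below applies this definition to a toy PMF of my own. Notably, \(X\) and \(Y\) are independent under the original PMF, yet not conditionally independent given \(A\), previewing the first point of the Note below:

```python
# Check conditional independence given A: renormalize the joint PMF over A,
# then test whether p_{X,Y|A} factors into p_{X|A} p_{Y|A}.
from fractions import Fraction

p = {(x, y): Fraction(1, 4) for x in [0, 1] for y in [0, 1]}  # X, Y independent
A = {w for w in p if sum(w) % 2 == 0}  # A = "X + Y is even" = {(0,0), (1,1)}

P_A = sum(p[w] for w in A)
p_XY_A = {w: p[w] / P_A for w in A}  # p_{X,Y|A}

p_X_A = {x: sum(q for (a, _), q in p_XY_A.items() if a == x) for x in [0, 1]}
p_Y_A = {y: sum(q for (_, b), q in p_XY_A.items() if b == y) for y in [0, 1]}

cond_indep = all(p_XY_A.get((x, y), Fraction(0)) == p_X_A[x] * p_Y_A[y]
                 for x in [0, 1] for y in [0, 1])
print("conditionally independent given A:", cond_indep)  # False: A couples X and Y
```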

 

Note.

  • Conditional independence may not imply unconditional independence, and vice versa.
  • If \(X\) and \(Y\) are independent random variables, then \(E[XY] = E[X]E[Y]\),
  • and, more generally, \(E[g(X)h(Y)] = E[g(X)]E[h(Y)]\) for any functions \(g\) and \(h\).
  • The variance of the sum of two independent random variables is equal to the sum of their variances:
    • \(var(X+Y) = var(X)+var(Y)\) (the expectation and variance facts are sanity-checked below).
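
A quick Monte Carlo sanity check of the last two facts; the distributions and the functions \(g\) and \(h\) below are arbitrary illustrative choices:

```python
# Empirically compare E[g(X)h(Y)] with E[g(X)]E[h(Y)], and var(X+Y) with
# var(X) + var(Y), for independent draws of X and Y.
import random

random.seed(0)
N = 200_000
xs = [random.gauss(0, 1) for _ in range(N)]        # X ~ Normal(0, 1)
ys = [random.choice([1, 2, 3]) for _ in range(N)]  # Y uniform on {1, 2, 3}

g = lambda x: x * x
h = lambda y: 1.0 / y
mean = lambda a: sum(a) / len(a)
var = lambda a: mean([t * t for t in a]) - mean(a) ** 2

# Each pair of printed values agrees up to sampling noise.
print(mean([g(x) * h(y) for x, y in zip(xs, ys)]),
      mean([g(x) for x in xs]) * mean([h(y) for y in ys]))
print(var([x + y for x, y in zip(xs, ys)]), var(xs) + var(ys))
```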

 

 

 

 

 


Summary of Facts About Independent Random Variables

Let \(A\) be an event with \(P(A) > 0\).

  • \(X\) is independent of \(A\) if \(p_{X|A}(x) = p_X(x)\) for all \(x\).
  • \(X\) and \(Y\) are independent if \(p_{X, Y}(x, y) = p_X(x)p_Y(y)\) for all \(x, y\).
  • If \(X\) and \(Y\) are independent, then \(E[XY] = E[X]E[Y]\); more generally, \(E[g(X)h(Y)] = E[g(X)]E[h(Y)]\) for any functions \(g\) and \(h\), and \(var(X+Y) = var(X) + var(Y)\).

Independence of Several Random Variables

 

The natural extension: three random variables \(X\), \(Y\), and \(Z\) are independent if $$p_{X,Y,Z}(x, y, z) = p_X(x)\,p_Y(y)\,p_Z(z), \quad \text{for all } x, y, z.$$

If \(X\), \(Y\), and \(Z\) are independent random variables,
then any three random variables of the form \(f(X)\), \(g(Y)\), and \(h(Z)\) are also independent.

 

On the other hand, two random variables of the form \(g(X, Y)\) and \(h(Y, Z)\) are usually not independent,
because they are both affected by \(Y\) (see the simulation below).
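
An illustrative simulation (the distributions and functions are my own choices): the sample covariance of \(f(X)\) and \(g(Y)\) is near zero, consistent with independence, while \(g(X, Y)\) and \(h(Y, Z)\) show a clearly positive covariance because both move with \(Y\). (Zero covariance alone would not prove independence, but a nonzero one does rule it out.)

```python
# Functions of distinct independent variables vs. functions sharing Y.
import random

random.seed(1)
N = 100_000
X = [random.random() for _ in range(N)]  # X, Y, Z i.i.d. Uniform(0, 1)
Y = [random.random() for _ in range(N)]
Z = [random.random() for _ in range(N)]

def cov(a, b):
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    return sum((s - ma) * (t - mb) for s, t in zip(a, b)) / len(a)

U = [x + y for x, y in zip(X, Y)]  # g(X, Y) = X + Y
V = [y + z for y, z in zip(Y, Z)]  # h(Y, Z) = Y + Z

print("cov(f(X), g(Y))     ~ 0:   ", cov([x**2 for x in X], [2*y for y in Y]))
print("cov(g(X,Y), h(Y,Z)) ~ 1/12:", cov(U, V))  # equals var(Y) = 1/12 in theory
```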

 

 

 


Variance of the Sum of Independent Random Variables

For example:

  • They arise in statistical applications where we "average" a number of independent measurements,
    with the aim of minimizing the effects of measurement errors.
  • They also arise when dealing with the cumulative effect of several independent sources of randomness.

$$var(X_1 + X_2 + \cdots + X_n) = var(X_1) + var(X_2) + \cdots + var(X_n).$$

 

This can be verified by repeated use of the formula \(var(X + Y) = var(X) + var(Y)\) for two independent random variables \(X\) and \(Y\).
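
To tie the formula to the averaging application above: the sample mean of \(n\) i.i.d. measurements \(X_1, \ldots, X_n\) has variance \(var(X_1)/n\), since \(var(X_1 + \cdots + X_n) = n\,var(X_1)\) and dividing a random variable by \(n\) divides its variance by \(n^2\). A short simulation (standard normal measurements, an assumption of mine) checks this:

```python
# The empirical variance of the average of n i.i.d. measurements is close to
# var(X)/n, illustrating why averaging suppresses measurement error.
import random

random.seed(2)
n, trials = 25, 50_000
# Each measurement ~ Normal(0, 1), so var(X) = 1 and var(mean) should be 1/25.
means = [sum(random.gauss(0, 1) for _ in range(n)) / n for _ in range(trials)]

m = sum(means) / trials
empirical = sum((t - m) ** 2 for t in means) / trials
print(empirical, "vs", 1.0 / n)
```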

 

 

 

 

 

 


References

Bertsekas, D. P., & Tsitsiklis, J. N. (2008). Introduction to Probability (2nd ed.). Athena Scientific.