2.5 Joint PMFs of Multiple Random Variables
Joint PMF
Consider two discrete random variables X and Y associated with the same experiment.
The joint probability mass function (joint PMF) of X and Y, denoted \(p_{X, Y}\), specifies the probabilities of the pairs of values that X and Y can take, and it completely characterizes the distribution of the discrete random vector \((X, Y)\).
Notation.
$$p_{X, Y}(x, y) = P(X = x, Y = y)$$
Here, \(P(X = x, Y = y)\) is an abbreviation for
$$P(\{X = x\} \cap \{Y = y\}), \quad \text{i.e.,} \quad P(\{X = x\} \text{ and } \{Y = y\}).$$
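As a quick illustration (the values and the name p_XY are hypothetical, not from the text), a joint PMF can be represented as a dictionary mapping pairs \((x, y)\) to probabilities, which should be nonnegative and sum to 1:

```python
# A minimal sketch: a hypothetical joint PMF of two discrete random
# variables X and Y, stored as a dict mapping (x, y) to
# p_{X,Y}(x, y) = P(X = x, Y = y).
p_XY = {
    (1, 1): 0.125, (1, 2): 0.375,
    (2, 1): 0.25,  (2, 2): 0.25,
}

# A valid joint PMF is nonnegative and its values sum to 1.
assert all(p >= 0 for p in p_XY.values())
assert abs(sum(p_XY.values()) - 1.0) < 1e-12

print(p_XY[(2, 1)])  # P(X = 2, Y = 1) = 0.25
```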
The PMFs of X and Y can be calculated from the joint PMF:
$$p_X(x) = \sum_{y}p_{X, Y}(x, y)$$
$$p_Y(y) = \sum_{x}p_{X, Y}(x, y)$$
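For example (a minimal sketch reusing the hypothetical joint PMF from above), these sums can be carried out directly:

```python
from collections import defaultdict

# Marginal PMFs obtained by summing the (hypothetical) joint PMF over
# the other variable.
p_XY = {
    (1, 1): 0.125, (1, 2): 0.375,
    (2, 1): 0.25,  (2, 2): 0.25,
}

p_X = defaultdict(float)
p_Y = defaultdict(float)
for (x, y), p in p_XY.items():
    p_X[x] += p  # p_X(x) = sum over y of p_{X,Y}(x, y)
    p_Y[y] += p  # p_Y(y) = sum over x of p_{X,Y}(x, y)

print(dict(p_X))  # {1: 0.5, 2: 0.5}
print(dict(p_Y))  # {1: 0.375, 2: 0.625}
```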
Marginal PMF
We refer to \(p_X\) and \(p_Y\) as the marginal PMFs, to distinguish them from the joint PMF.
The marginal PMFs can be calculated by using the tabular method
: the marginal PMF of X or Y at a given value is obtained by adding the table entries
along a corresponding column (for the marginal PMF of X) or row (for the marginal PMF of Y), respectively.
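The sketch below illustrates the tabular method on a hypothetical 2×2 table (the layout, with rows indexed by y and columns by x, is an assumption for this example):

```python
import numpy as np

# Hypothetical joint PMF table: rows index values of Y, columns index
# values of X.
table = np.array([
    [0.125, 0.25],   # y = 1: entries for x = 1, 2
    [0.375, 0.25],   # y = 2: entries for x = 1, 2
])

p_X = table.sum(axis=0)  # add entries along each column -> marginal PMF of X
p_Y = table.sum(axis=1)  # add entries along each row    -> marginal PMF of Y

print(p_X)  # [0.5 0.5]
print(p_Y)  # [0.375 0.625]
```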
Functions of Multiple Random Variables
One can generate new random variables by considering functions of several random variables associated with the same experiment.
In particular, a function Z = g(X, Y) of the random variables X and Y defines another random variable.
Its PMF can be calculated from the joint PMF \(p_{X, Y}\) according to $$p_Z(z) = \sum_{\{(x, y) \,\mid\, g(x, y) = z\}} p_{X, Y}(x, y).$$
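A minimal sketch of this calculation (the joint PMF values and the choice g(x, y) = x + y are hypothetical):

```python
from collections import defaultdict

p_XY = {
    (1, 1): 0.125, (1, 2): 0.375,
    (2, 1): 0.25,  (2, 2): 0.25,
}

def g(x, y):
    return x + y  # hypothetical choice of g

# p_Z(z) is the sum of p_{X,Y}(x, y) over all pairs (x, y) with g(x, y) = z.
p_Z = defaultdict(float)
for (x, y), p in p_XY.items():
    p_Z[g(x, y)] += p

print(dict(p_Z))  # {2: 0.125, 3: 0.625, 4: 0.25}
```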
Furthermore, the expected value rule for functions applies:
$$E[g(X, Y)] = \sum_x \sum_y g(x, y) p_{X, Y}(x, y).$$
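For instance (again with hypothetical numbers and g(x, y) = xy), the expected value can be computed directly from the joint PMF, without first finding the PMF of Z = g(X, Y):

```python
p_XY = {
    (1, 1): 0.125, (1, 2): 0.375,
    (2, 1): 0.25,  (2, 2): 0.25,
}

def g(x, y):
    return x * y  # hypothetical choice of g

# Expected value rule: E[g(X, Y)] = sum over (x, y) of g(x, y) * p_{X,Y}(x, y).
E_g = sum(g(x, y) * p for (x, y), p in p_XY.items())
print(E_g)  # 1*0.125 + 2*0.375 + 2*0.25 + 4*0.25 = 2.375
```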
When \(g\) is linear, of the form \(aX + bY + c\), we have
$$E[aX+bY+c] = aE[X] + bE[Y] + c.$$
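A small numerical check of this linearity property on hypothetical values (the coefficients a, b, c are arbitrary):

```python
p_XY = {
    (1, 1): 0.125, (1, 2): 0.375,
    (2, 1): 0.25,  (2, 2): 0.25,
}
a, b, c = 2.0, -1.0, 3.0  # arbitrary coefficients

# Left-hand side: E[aX + bY + c] via the expected value rule.
lhs = sum((a * x + b * y + c) * p for (x, y), p in p_XY.items())

# Right-hand side: a E[X] + b E[Y] + c from the marginal expectations.
E_X = sum(x * p for (x, y), p in p_XY.items())
E_Y = sum(y * p for (x, y), p in p_XY.items())
rhs = a * E_X + b * E_Y + c

print(lhs, rhs)  # both equal 4.375
```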
More than Two Random Variables
The joint PMF of three random variables X, Y and Z is defined as
$$p_{X, Y, Z}(x, y, z) = P(X = x, Y = y, Z = z)$$
for all possible triplets of numerical values \((x, y, z)\).
The marginal PMFs are obtained by summing over the other variables:
$$p_{X, Y}(x, y) = \sum_{z} p_{X, Y, Z}(x, y, z)$$
$$p_X(x) = \sum_{y} \sum_{z} p_{X, Y, Z}(x, y, z)$$
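As an illustration (the uniform joint PMF over three binary variables is a hypothetical choice), marginalizing out one or two variables is again just summation:

```python
from collections import defaultdict
from itertools import product

# Hypothetical joint PMF of three binary random variables, uniform over
# the 8 triplets (x, y, z).
p_XYZ = {(x, y, z): 0.125 for x, y, z in product((0, 1), repeat=3)}

p_XY = defaultdict(float)
p_X = defaultdict(float)
for (x, y, z), p in p_XYZ.items():
    p_XY[(x, y)] += p  # sum over z
    p_X[x] += p        # sum over y and z

print(dict(p_XY))  # each pair (x, y) has probability 0.25
print(dict(p_X))   # each value x has probability 0.5
```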
Expected value (generalized version)
for any random variables \(X_1, X_2, \cdots, X_n\) and any scalars \(a_1, a_2, \cdots , a_n\), we have
$$E[a_1 X_1 + a_2 X_2 + \cdots + a_n X_n] = a_1 E[X_1] + a_2 E[X_2] + \cdots + a_n E[X_n].$$
E[X] of a binomial random variable
: writing \(X = X_1 + X_2 + \cdots + X_n\), where each \(X_i\) is a Bernoulli random variable indicating success in the \(i\)th trial (so that \(E[X_i] = p\)), we obtain $$E[X] = \sum_{i=1}^{n} E[X_i] = \sum_{i=1}^n p = np.$$
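A brief sanity check with hypothetical parameters n and p, comparing the direct computation of E[X] from the binomial PMF with the value np obtained from the indicator decomposition:

```python
from math import comb

n, p = 10, 0.25  # hypothetical parameters

# Direct computation: E[X] = sum over k of k * P(X = k) with the
# binomial PMF P(X = k) = C(n, k) p^k (1 - p)^(n - k).
direct = sum(k * comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1))

# Via the decomposition X = X_1 + ... + X_n with E[X_i] = p.
via_indicators = n * p

print(direct, via_indicators)  # both approximately 2.5
```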
Summary of Facts About Joint PMFs
Let X and Y be random variables associated with the same experiment.
- The joint PMF \(p_{X, Y}\) of \(X\) and \(Y\) is defined by $$p_{X, Y}(x, y) = P(X = x, Y = y).$$
- The marginal PMFs of X and Y can be obtained from the joint PMF, using the formulas $$p_X(x) = \sum_y p_{X, Y} (x, y), \qquad p_Y(y) = \sum_x p_{X, Y} (x, y).$$
- A function \(g(X, Y)\) of \(X\) and \(Y\) defines another random variable, and $$E[g(X, Y)] = \sum_x \sum_y g(x, y) p_{X, Y}(x, y). $$
If \(g\) is linear, of the form \(aX + bY + c\), we have $$E[aX+bY+c] = aE[X] + bE[Y] + c.$$
- The above have natural extensions to the case where more than two random variables are involved.
References
Bertsekas, D. P., & Tsitsiklis, J. N. (2008). Introduction to Probability (2nd ed.). Athena Scientific.