In this article we are going to learn about two-dimensional random variables, the cumulative distribution function, marginal probability distributions, and the joint density function.
Two Dimensional Random Variable
Let E be an experiment and S a sample space associated with E. Let X = X(s) and Y = Y(s) be two functions, each assigning a real number to each outcome s ∈ S. We call (X, Y) a two-dimensional random variable.
If X1 = X1(s), X2 = X2(s), …, Xn = Xn(s) are n functions, each assigning a real number to every outcome s ∈ S, we call (X1, X2, …, Xn) an n-dimensional random variable.
Discrete and Continuous Random Variables
Let (X, Y) be a two-dimensional discrete random variable, so that the possible values of (X, Y) are finite or countably infinite and may be represented as (xi, yj), i = 1, 2, …, j = 1, 2, … With each possible value (xi, yj) we associate a number p(xi, yj) representing P[X = xi, Y = yj] and satisfying the following conditions:
\[(i)\ p({{x}_{i}},{{y}_{j}})\ge 0~~\forall i,j\]
\[(ii)\sum\limits_{j=1}^{\infty }{\sum\limits_{i=1}^{\infty }{p({{x}_{i}},{{y}_{j}})=1}}\]
The function p, defined for all (xi, yj) in the range space of (X, Y), is called the probability function of (X, Y). The set of triples (xi, yj, p(xi, yj)), i, j = 1, 2, …, is called the probability distribution of (X, Y).
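As a quick numerical illustration, the two conditions above can be checked for a finite joint probability table. The table below is purely hypothetical, chosen only so that the checks pass:

```python
import numpy as np

# A hypothetical joint probability table for illustration:
# rows index the values of X, columns index the values of Y,
# so p[i, j] = P[X = x_i, Y = y_j].
p = np.array([[0.10, 0.20, 0.10],
              [0.05, 0.25, 0.30]])

# Condition (i): every entry is non-negative.
assert (p >= 0).all()

# Condition (ii): the entries sum to 1 over all pairs (x_i, y_j).
assert np.isclose(p.sum(), 1.0)
```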
Joint Density Function
Let (X, Y) be a continuous random variable assuming all values in some region R of the Euclidean plane. The joint probability density function f(x, y) is a function satisfying the following conditions:
\[(i)f(x,y)\ge 0~~\forall x,y\]
\[(ii)\iint\limits_{R}{f(x,y)dxdy=1}\]
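The same two conditions can be verified numerically for a continuous density. The density f(x, y) = 4xy on the unit square is a hypothetical example, not one from this article; note that SciPy's dblquad takes its integrand as func(y, x), with y the inner integration variable:

```python
from scipy.integrate import dblquad

# Hypothetical joint density: f(x, y) = 4xy on the unit square
# 0 <= x, y <= 1, and 0 elsewhere.  Condition (i) holds since
# 4xy >= 0 on that region.
total, _ = dblquad(lambda y, x: 4 * x * y, 0, 1, 0, 1)
print(total)  # ~1.0, so condition (ii) holds and f is a valid joint pdf
```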
Cumulative Distribution Function
Let (X, Y) be a two-dimensional random variable. The cumulative distribution function (cdf) F of the two-dimensional random variable (X, Y) is defined by F(x, y) = P[X ≤ x, Y ≤ y].
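For a discrete pair, F(x, y) is just the sum of the joint table over all cells with xi ≤ x and yj ≤ y. A minimal sketch, reusing the hypothetical table from above with assumed value grids:

```python
import numpy as np

# Hypothetical discrete pair: X takes values 0, 1 and Y takes 0, 1, 2,
# with joint table p[i, j] = P[X = x_i, Y = y_j].
x_vals = np.array([0, 1])
y_vals = np.array([0, 1, 2])
p = np.array([[0.10, 0.20, 0.10],
              [0.05, 0.25, 0.30]])

def cdf(x, y):
    """F(x, y) = P[X <= x, Y <= y]: sum the joint table over admissible cells."""
    mask = np.outer(x_vals <= x, y_vals <= y)
    return p[mask].sum()

print(cdf(0, 1))  # P[X <= 0, Y <= 1] = 0.10 + 0.20 = 0.30
```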
Marginal and Conditional Probability Distributions
With each two-dimensional random variable (X, Y) we associate two one-dimensional random variables, namely X and Y, considered individually. That is, we may be interested in the probability distribution of X alone, or in the probability distribution of Y alone.
- Let (X, Y) be a discrete random variable with probability distribution p(xi, yj), i, j = 1, 2, … The marginal probability distribution of X is defined as \[p({{x}_{i}})=P\left[ X={{x}_{i}} \right]=\sum\limits_{j=1}^{\infty }{p({{x}_{i}},{{y}_{j}})}\] Similarly, the marginal probability distribution of Y is defined as \[q({{y}_{j}})=P\left[ Y={{y}_{j}} \right]=\sum\limits_{i=1}^{\infty }{p({{x}_{i}},{{y}_{j}})}\] (A numerical sketch of this case follows the list.)
- Let (X, Y) be a two-dimensional continuous random variable with joint pdf f(x, y). The marginal probability density function of X can be defined as \[g(x)=\int\limits_{-\infty }^{\infty }{f(x,y)dy}\] Similarly, the marginal probability density function of Y can be defined as \[h(y)=\int\limits_{-\infty }^{\infty }{f(x,y)dx}\]
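As promised in the first bullet, here is a minimal sketch of the discrete case: the marginals are simply the row and column sums of the hypothetical joint table used earlier.

```python
import numpy as np

# Hypothetical joint table: rows are values of X, columns are values of Y.
p = np.array([[0.10, 0.20, 0.10],
              [0.05, 0.25, 0.30]])

# Marginal of X: sum out the y index (columns).
p_x = p.sum(axis=1)   # [0.40, 0.60]

# Marginal of Y: sum out the x index (rows).
q_y = p.sum(axis=0)   # [0.15, 0.45, 0.40]

# Each marginal is itself a probability distribution.
assert np.isclose(p_x.sum(), 1.0) and np.isclose(q_y.sum(), 1.0)
```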
Note:
\[P(c\le X\le d)=P(c\le X\le d,-\infty \le Y\le \infty )\]
\[\Rightarrow P(c\le X\le d)=\int\limits_{c}^{d}{\int\limits_{-\infty }^{\infty }{f(x,y)dydx}}\]
\[\therefore P(c\le X\le d)=\int\limits_{c}^{d}{g(x)dx}\]
\[\text{Similarly},~~P(a\le Y\le b)=\int\limits_{a}^{b}{h(y)dy}\]
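This note can be confirmed numerically. For the hypothetical density f(x, y) = 4xy on the unit square used earlier, the marginal of X is g(x) = 2x, and both routes to P(c ≤ X ≤ d) agree:

```python
from scipy.integrate import dblquad, quad

# P(c <= X <= d) via the joint density and via the marginal g(x) = 2x.
c, d = 0.0, 0.5
via_joint, _ = dblquad(lambda y, x: 4 * x * y, c, d, 0, 1)  # y integrated out
via_marginal, _ = quad(lambda x: 2 * x, c, d)
print(via_joint, via_marginal)  # both ~0.25
```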
Definition
Let (X, Y) be a discrete two-dimensional random variable with probability distribution p(xi, yj). Let p(xi) and q(yj) be the marginal pdfs of X and Y, respectively.
The conditional pdf of X given Y = yj is defined by
\[p\left( {{x}_{i}}|{{y}_{j}} \right)=P\left[ X={{x}_{i}}|Y={{y}_{j}} \right]=\frac{p({{x}_{i}},{{y}_{j}})}{q({{y}_{j}})},\quad \text{if }q({{y}_{j}})>0\]
Similarly, the conditional pdf of Y given X = xi is defined as
\[q\left( {{y}_{j}}|{{x}_{i}} \right)=P\left[ Y={{y}_{j}}|X={{x}_{i}} \right]=\frac{p({{x}_{i}},{{y}_{j}})}{p({{x}_{i}})},\quad \text{if }p({{x}_{i}})>0\]
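In practice, the conditional pmf of X given Y = yj is just column j of the joint table rescaled by the marginal q(yj). A sketch with the same hypothetical table:

```python
import numpy as np

# Hypothetical joint table p[i, j] = P[X = x_i, Y = y_j], as before.
p = np.array([[0.10, 0.20, 0.10],
              [0.05, 0.25, 0.30]])
q_y = p.sum(axis=0)                    # marginal of Y

# Conditional pmf of X given Y = y_j: divide column j by its column total.
j = 1
p_x_given_y = p[:, j] / q_y[j]         # [0.20, 0.25] / 0.45
print(p_x_given_y, p_x_given_y.sum())  # sums to 1, as a pmf must
```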
Definition
Let (X, Y) be a continuous two-dimensional random variable with joint pdf f. Let g and h be the marginal pdfs of X and Y, respectively.
The conditional pdf of X given Y = y is defined by
\[g(x|y)=\frac{f(x,y)}{h(y)},\quad \text{if }h(y)>0\]
The conditional pdf of Y given X = x is defined by
\[h(y|x)=\frac{f(x,y)}{g(x)},\quad \text{if }g(x)>0\]
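A symbolic sketch of these definitions, using the hypothetical dependent density f(x, y) = x + y on the unit square (SymPy assumed available):

```python
import sympy as sp

# Hypothetical dependent density: f(x, y) = x + y on the unit square.
x, y = sp.symbols('x y', positive=True)
f = x + y
h = sp.integrate(f, (x, 0, 1))   # marginal of Y: h(y) = y + 1/2
g_cond = sp.simplify(f / h)      # conditional pdf of X given Y = y
print(h, g_cond)                 # g(x|y) = (x + y)/(y + 1/2)
# g(x|y) still depends on y, so X and Y are not independent here.
```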
Independent Random Variables
Let (X, Y) be a two-dimensional discrete random variable. We say that X and Y are independent random variables if and only if p(xi, yj) = p(xi)q(yj) for all i and j, that is, P[X = xi, Y = yj] = P[X = xi] P[Y = yj] for all i and j.
Let (X, Y) be a two-dimensional continuous random variable. We say that X and Y are independent random variables if and only if f(x, y) = g(x)h(y) for all (x, y), where f is the joint pdf and g and h are the marginal pdfs of X and Y, respectively.
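In the discrete case, independence can be tested by comparing the joint table against the outer product of its marginals. With the hypothetical table used throughout, the test fails, so that pair is dependent:

```python
import numpy as np

# X and Y are independent iff p[i, j] = p_x[i] * q_y[j] for every cell,
# i.e. the joint table equals the outer product of its marginals.
p = np.array([[0.10, 0.20, 0.10],
              [0.05, 0.25, 0.30]])
p_x, q_y = p.sum(axis=1), p.sum(axis=0)
print(np.allclose(p, np.outer(p_x, q_y)))  # False: this pair is dependent
```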
Example 01
Find the constant k so that
\[f(x,y)=\begin{cases} k\left( x+1 \right){{e}^{-y}}, & 0<x<1,\ y>0 \\ 0, & \text{otherwise} \end{cases}\]
is a joint probability density function. Are X and Y independent?
Solution:
We observe that f(x, y) ≥ 0 for all x and y provided k ≥ 0.
Further,
\[\int\limits_{-\infty }^{\infty }{\int\limits_{-\infty }^{\infty }{f(x,y)dxdy}}=\int\limits_{y=0}^{\infty }{\int\limits_{x=0}^{1}{f(x,y)dxdy}}\]
\[=k\left\{ \int\limits_{0}^{1}{\left( x+1 \right)dx} \right\}\left\{ \int\limits_{0}^{\infty }{{{e}^{-y}}dy} \right\}\]
\[=k\left[ \frac{{{x}^{2}}}{2}+x \right]_{0}^{1}\left[ -{{e}^{-y}} \right]_{0}^{\infty }=\frac{3}{2}k\]
For f(x, y) to be a joint probability density function this integral must equal 1, so (3/2)k = 1, i.e., k = 2/3. With k = 2/3, we find that the marginal density functions are
\[g(x)=\int\limits_{-\infty }^{\infty }{f(x,y)dy}=\frac{2}{3}\left( x+1 \right)\int\limits_{0}^{\infty }{{{e}^{-y}}dy}\]
\[\therefore g(x)=\frac{2}{3}\left( x+1 \right),0<x<1\]
\[\text{and}~h(y)=\int\limits_{-\infty }^{\infty }{f(x,y)dx}=\frac{2}{3}{{e}^{-y}}\int\limits_{0}^{1}{\left( x+1 \right)dx}\]
\[\therefore h(y)=\frac{2}{3}{{e}^{-y}}.\frac{3}{2}={{e}^{-y}},y>0\]
We observe that g(x)h(y) = f(x, y) for all (x, y).
Therefore, X and Y are independent random variables.
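As a numerical cross-check of this example (assuming SciPy is available), the density with k = 2/3 indeed integrates to 1:

```python
import numpy as np
from scipy.integrate import dblquad

# Verify that f(x, y) = k (x + 1) e^{-y} on 0 < x < 1, y > 0
# integrates to 1 when k = 2/3.
k = 2 / 3
total, _ = dblquad(lambda y, x: k * (x + 1) * np.exp(-y), 0, 1, 0, np.inf)
print(total)  # ~1.0
```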