Definition:
PDF for jointly distributed random variables
X and Y are jointly continuous if there exists a function f(x, y), called the joint probability density function, such that for every set C of pairs of real numbers, P{(X, Y) ∈ C} = ∫∫_{(x,y) ∈ C} f(x, y) dx dy
Here f(x, y) is a density, not a probability: probabilities are obtained by integrating it over a region.
Defining C = {(x, y) : x ∈ A, y ∈ B},
then P{X ∈ A, Y ∈ B} = ∫_B ∫_A f(x, y) dx dy
Equivalently, F(a, b) = P{X ∈ (−∞, a], Y ∈ (−∞, b]} = ∫_{−∞}^{b} ∫_{−∞}^{a} f(x, y) dx dy
f(a, b) = ∂²F(a, b) / ∂a ∂b
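As a concrete check of these formulas, the double integral P{X ∈ A, Y ∈ B} can be evaluated numerically. The density f(x, y) = 2e^{−x}e^{−2y} for x, y > 0 is an assumed example (the product of Exp(1) and Exp(2) densities), not one from the text; a sketch using scipy:

```python
import numpy as np
from scipy.integrate import dblquad

# Assumed example density: f(x, y) = 2 e^{-x} e^{-2y} for x, y > 0
# (product of Exp(1) and Exp(2) densities, so it integrates to 1).
def f(y, x):  # dblquad expects the inner variable (y) first
    return 2.0 * np.exp(-x) * np.exp(-2.0 * y)

# P{X in A, Y in B} with A = (1, inf), B = (0, 1):
# x ranges over (1, inf), y over (0, 1)
p, _ = dblquad(f, 1, np.inf, 0, 1)

# Exact value for this density: e^{-1} * (1 - e^{-2})
exact = np.exp(-1) * (1 - np.exp(-2))
print(p, exact)
```

Because this particular density factors, the exact value is just the product of two one-dimensional integrals, which makes it easy to verify the numerical answer.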
If X and Y are jointly continuous, each is individually continuous, and their marginal PDFs are:
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx
Therefore the expectation of a single variable is E[X] = ∫ x f_X(x) dx = ∫ ∫_{−∞}^{∞} x f(x, y) dy dx
P{X ∈ A} = P{X ∈ A, Y ∈ (−∞, ∞)} = ∫_A ∫_{−∞}^{∞} f(x, y) dy dx = ∫_A f_X(x) dx
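The marginal and expectation formulas can be checked the same way. Continuing the assumed example density f(x, y) = 2e^{−x}e^{−2y} for x, y > 0 (not from the text), integrating out y should recover f_X(x) = e^{−x}, and hence E[X] = 1:

```python
import numpy as np
from scipy.integrate import quad

# Assumed example density (product of Exp(1) and Exp(2) densities).
def f(x, y):
    return 2.0 * np.exp(-x) * np.exp(-2.0 * y)

# Marginal: f_X(x) = integral of f(x, y) dy over (0, inf)
def f_X(x):
    return quad(lambda y: f(x, y), 0, np.inf)[0]

# For this density the marginal is Exp(1), so f_X(1) should be e^{-1}
print(f_X(1.0), np.exp(-1))

# E[X] = integral of x * f_X(x) dx; for Exp(1) this is 1
e_x, _ = quad(lambda x: x * f_X(x), 0, np.inf)
print(e_x)
```

The nested quad call mirrors the double integral in the expectation formula: the inner integral produces the marginal, and the outer one averages x against it.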
For n > 2 jointly distributed random variables, we can define joint probability distributions in exactly the same manner as we did for n = 2
Independent random variables
Random variables X and Y are independent if, for any two sets of real numbers A and B, P{X ∈ A, Y ∈ B} = P{X ∈ A} · P{Y ∈ B}
Equivalently, P{X ≤ a, Y ≤ b} = P{X ≤ a} · P{Y ≤ b}
F ( a , b ) = F X ( a ) . F Y ( b ) for all a , b
Two continuous random variables are independent if and only if their joint PDF can be expressed as f_{X,Y}(x, y) = h(x) · g(y), −∞ < x, y < ∞
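The factorization criterion can be illustrated numerically. For the assumed example density f(x, y) = 2e^{−x}e^{−2y} (h(x) = e^{−x}, g(y) = 2e^{−2y}), the joint PDF equals the product of the two marginals at every point, which is exactly independence:

```python
import numpy as np

# Assumed example: f(x, y) = 2 e^{-x} e^{-2y} factors as h(x) * g(y)
# with h(x) = e^{-x} and g(y) = 2 e^{-2y}, so X and Y are independent.
def joint(x, y):
    return 2.0 * np.exp(-x) * np.exp(-2.0 * y)

def f_X(x):          # marginal of X: Exp(1) density
    return np.exp(-x)

def f_Y(y):          # marginal of Y: Exp(2) density
    return 2.0 * np.exp(-2.0 * y)

# Compare joint vs product of marginals on a grid of (x, y) points
xs, ys = np.meshgrid(np.linspace(0.1, 5, 50), np.linspace(0.1, 5, 50))
independent = np.allclose(joint(xs, ys), f_X(xs) * f_Y(ys))
print(independent)  # True: the joint PDF factors
```

For a density that does not factor (e.g. one with an x·y cross term that cannot be split), the same grid comparison would return False.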
Expectation of a function of jointly continuous variables: