


Question

Recall the definition of independence: we say that n random variables {X1, X2, ..., Xn} are (mutually) independent if (and only if) the joint pdf factorizes as a product of marginal distributions, i.e.

p(x) = ∏_{i=1}^{n} p_{X_i}(x_i)

(on the left, x is the n-dimensional vector of values taken on by the random variables; on the right, the x_i are the components of x). Defend or refute:

(a) Var[X + Y] = Var[X] + Var[Y] if and only if X and Y are independent.

(b) If X and Y are independent, then X^7 and sin(Y) are independent. (Intuitively, what is sin(Y)? It's the random variable that takes on sin(y) whenever Y would have taken on y.)

(c) Suppose X and Y are independent, and let Z = X + Y. Then the pmf of Z is a convolution of the marginal pmfs: p_Z(z) = (p_X * p_Y)(z) for all z ∈ ℝ. (Each pmf is a bona fide function mapping ℝ → ℝ, so it makes sense what is meant by convolution.)

(d) If X, Y, Z are all mutually independent, then X and Y are mutually independent.

(e) If X, Y are independent, then E[XY] = E[X]E[Y].

(f) Knowing the joint distribution of X, Y may be insufficient to determine whether X and Y are independent; more information is needed.

Explanation / Answer

a) False (refute).

Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y),

so the equality holds exactly when Cov(X, Y) = 0. Independence implies Cov(X, Y) = 0, which gives the "if" direction, but the converse fails: uncorrelated random variables need not be independent, so the "only if" direction is refuted.
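A small counterexample makes the refutation concrete. The distributions below are hypothetical choices for illustration (not from the problem): X uniform on {-1, 0, 1} and Y = 1{X = 0}. Then Cov(X, Y) = 0, so the variances add, yet X and Y are clearly dependent.

```python
from fractions import Fraction

# Hypothetical counterexample: X uniform on {-1, 0, 1}, Y = 1 if X == 0 else 0.
third = Fraction(1, 3)
pmf = {(-1, 0): third, (0, 1): third, (1, 0): third}  # joint pmf of (X, Y)

def E(f):
    """Expectation of f(X, Y) under the joint pmf."""
    return sum(p * f(x, y) for (x, y), p in pmf.items())

var_X = E(lambda x, y: x * x) - E(lambda x, y: x) ** 2
var_Y = E(lambda x, y: y * y) - E(lambda x, y: y) ** 2
var_sum = E(lambda x, y: (x + y) ** 2) - E(lambda x, y: x + y) ** 2

print(var_sum == var_X + var_Y)          # True: the variances add...
print(pmf.get((1, 1), Fraction(0)))      # ...but P(X=1, Y=1) = 0,
print(third * third)                      # while P(X=1) * P(Y=1) = 1/9
```

Exact arithmetic with `Fraction` avoids any floating-point doubt: Var(X + Y) = 8/9 = Var(X) + Var(Y), yet the joint pmf does not factorize.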

b) True (defend).

sin(Y) is purely a function of Y, and X^7 is purely a function of X. In general, if X and Y are independent, then f(X) and g(Y) are independent for any (measurable) functions f and g. Since X and Y are independent, X^7 and sin(Y) are independent as well.
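This can be checked by enumeration for discrete distributions. The distributions below are hypothetical (chosen so that the values are distinct): X uniform on {-1, 0, 2}, Y uniform on {0, 1, 2}, independent by construction; the joint pmf of (X^7, sin(Y)) then factorizes into its marginals.

```python
import math
from fractions import Fraction

# Hypothetical independent discrete distributions for the demo.
pX = {x: Fraction(1, 3) for x in (-1, 0, 2)}
pY = {y: Fraction(1, 3) for y in (0, 1, 2)}

def pushforward_joint(f, g):
    """Joint pmf of (f(X), g(Y)) when X and Y are independent."""
    joint = {}
    for x, px in pX.items():
        for y, py in pY.items():
            key = (f(x), g(y))
            joint[key] = joint.get(key, Fraction(0)) + px * py
    return joint

joint = pushforward_joint(lambda x: x ** 7, math.sin)

# Marginals of the transformed pair (U, V) = (X^7, sin(Y))
pU, pV = {}, {}
for (u, v), p in joint.items():
    pU[u] = pU.get(u, Fraction(0)) + p
    pV[v] = pV.get(v, Fraction(0)) + p

# Independence: the joint factorizes into the product of marginals.
print(all(joint[(u, v)] == pU[u] * pV[v] for (u, v) in joint))  # True
```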

c) True (defend).

By independence, p_Z(z) = Σ_x P(X = x, Y = z − x) = Σ_x p_X(x) p_Y(z − x) = (p_X * p_Y)(z), which is exactly the convolution of the marginal pmfs.
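The convolution formula can be sketched with a familiar example (fair dice, chosen here for illustration): the pmf of the sum of two independent dice computed via p_Z(z) = Σ_x p_X(x) p_Y(z − x).

```python
from fractions import Fraction

# Z = X + Y for two independent fair dice (hypothetical example).
die = {k: Fraction(1, 6) for k in range(1, 7)}

def convolve(pX, pY):
    """pmf of X + Y: p_Z(z) = sum_x p_X(x) * p_Y(z - x)."""
    pZ = {}
    for x, px in pX.items():
        for y, py in pY.items():
            pZ[x + y] = pZ.get(x + y, Fraction(0)) + px * py
    return pZ

pZ = convolve(die, die)
print(pZ[7])                   # 1/6: six of the 36 equally likely pairs sum to 7
print(sum(pZ.values()) == 1)   # pZ is a bona fide pmf
```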

d) True (defend).

Mutual independence of {X, Y, Z} means the three-way joint factorizes as p_X(x) p_Y(y) p_Z(z); summing (or integrating) out Z leaves p_{X,Y}(x, y) = p_X(x) p_Y(y), so X and Y are independent.

e) True (defend).

If X and Y are independent, the joint pmf/pdf factorizes, so E[XY] = E[X]E[Y]; equivalently, Cov(X, Y) = E[XY] − E[X]E[Y] = 0.

f) False (refute). The joint distribution determines everything about the pair (X, Y), including both marginals, so checking whether the joint factorizes into the product of those marginals settles independence. No further information is needed.
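The product rule in (e) can also be verified by direct enumeration. The distributions below are hypothetical: X uniform on {1, 2, 3} and Y uniform on {0, 5}, made independent by using the product pmf.

```python
from fractions import Fraction

# Hypothetical independent discrete distributions.
pX = {x: Fraction(1, 3) for x in (1, 2, 3)}
pY = {y: Fraction(1, 2) for y in (0, 5)}

E_X = sum(x * p for x, p in pX.items())
E_Y = sum(y * p for y, p in pY.items())
# E[XY] computed against the product (joint) pmf px * py.
E_XY = sum(x * y * px * py for x, px in pX.items() for y, py in pY.items())

print(E_XY == E_X * E_Y)  # True
```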