Answer:
(a)[tex]E[X+Y]=E[X]+E[Y][/tex]
(b)[tex]Var(X+Y)=Var(X)+Var(Y)[/tex]
Step-by-step explanation:
Let X and Y be discrete random variables, and let E(X) and Var(X) denote the expected value and variance of X, respectively.
(a) We want to show that E[X + Y] = E[X] + E[Y].
When we have two random variables instead of one, we consider their joint distribution function.
For a function f(X,Y) of discrete variables X and Y, we can define
[tex]E[f(X,Y)]=\sum_{x,y}f(x,y)\cdot P(X=x, Y=y).[/tex]
Setting f(X,Y) = X + Y,
[tex]E[X+Y]=\sum_{x,y}(x+y)P(X=x,Y=y)\\=\sum_{x,y}xP(X=x,Y=y)+\sum_{x,y}yP(X=x,Y=y).[/tex]
Let us look at the first of these sums.
[tex]\sum_{x,y}xP(X=x,Y=y)\\=\sum_{x}x\sum_{y}P(X=x,Y=y)\\\text{(taking the marginal distribution of }X\text{)}\\=\sum_{x}xP(X=x)=E[X].[/tex]
Similarly,
[tex]\sum_{x,y}yP(X=x,Y=y)\\=\sum_{y}y\sum_{x}P(X=x,Y=y)\\\text{(taking the marginal distribution of }Y\text{)}\\=\sum_{y}yP(Y=y)=E[Y].[/tex]
Combining these two gives the formula:
[tex]\sum_{x,y}xP(X=x,Y=y)+\sum_{x,y}yP(X=x,Y=y) =E[X]+E[Y][/tex]
Therefore:
[tex]E[X+Y]=E[X]+E[Y] \text{ as required.}[/tex]
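The derivation can be checked numerically on a small joint distribution. The pmf below is a made-up example (the specific probabilities are assumptions chosen for illustration), and it deliberately makes X and Y dependent, since linearity of expectation does not require independence:

```python
# Hypothetical joint pmf P(X=x, Y=y) for x, y in {0, 1} (assumed example).
# X and Y are NOT independent here; linearity of expectation holds anyway.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# E[X+Y] computed directly from the joint distribution,
# i.e. sum over (x, y) of (x + y) * P(X=x, Y=y)
E_sum = sum((x + y) * p for (x, y), p in joint.items())

# E[X] and E[Y] via the marginals: summing the joint pmf over the
# other variable is exactly the marginalisation step in the proof
E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())

print(E_sum, E_X + E_Y)  # both equal 1.0
```

Both computations give the same value, matching E[X+Y] = E[X] + E[Y].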
(b) We want to show that if X and Y are independent random variables, then:
[tex]Var(X+Y)=Var(X)+Var(Y)[/tex]
By the definition of variance, we have:
[tex]Var(X+Y)=E\left[(X+Y-E[X+Y])^2\right][/tex]
Writing [tex]\mu_X=E[X][/tex] and [tex]\mu_Y=E[Y][/tex], and using part (a) to expand [tex]E[X+Y]=\mu_X+\mu_Y[/tex]:
[tex]=E[(X-\mu_X +Y- \mu_Y)^2]\\=E[(X-\mu_X)^2 +(Y- \mu_Y)^2+2(X-\mu_X)(Y- \mu_Y)]\\\text{(since expectation is linear, as shown in part (a))}\\=E[(X-\mu_X)^2] +E[(Y- \mu_Y)^2]+2E[(X-\mu_X)(Y- \mu_Y)]\\=E[(X-E[X])^2] +E[(Y- E[Y])^2]+2Cov(X,Y)[/tex]
Since X and Y are independent, Cov(X,Y) = 0, so this reduces to
[tex]=Var(X)+Var(Y)[/tex]
Therefore, as required:
[tex]Var(X+Y)=Var(X)+Var(Y)[/tex]
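This result can also be verified numerically. The marginals below are made-up biased coin flips (an assumed example), and independence is enforced by building the joint pmf as the product of the marginals:

```python
# Hypothetical independent variables (assumed example): two biased coin flips
pX = {0: 0.3, 1: 0.7}
pY = {0: 0.6, 1: 0.4}

# Independence means the joint pmf factors as P(X=x) * P(Y=y)
joint = {(x, y): pX[x] * pY[y] for x in pX for y in pY}

def mean(pmf):
    # E[V] = sum over v of v * P(V=v)
    return sum(v * p for v, p in pmf.items())

def var(pmf):
    # Var(V) = E[(V - E[V])^2]
    mu = mean(pmf)
    return sum((v - mu) ** 2 * p for v, p in pmf.items())

# Build the pmf of X+Y by accumulating joint probability mass by sum value
pSum = {}
for (x, y), p in joint.items():
    pSum[x + y] = pSum.get(x + y, 0.0) + p

print(var(pSum), var(pX) + var(pY))  # both equal 0.45 (up to float error)
```

Computing Var(X+Y) from the distribution of the sum and computing Var(X) + Var(Y) from the marginals give the same value, as the proof guarantees for independent variables.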