# The Distribution of a Linear Combination

Posted by Beetle B. on Wed 07 June 2017


Let $$X_{1},\dots,X_{n}$$ be random variables with means $$\mu_{i}$$ and variances $$\sigma_{i}^{2}$$. Then:

1. $$E(\sum_{i}a_{i}X_{i})=\sum_{i}a_{i}\mu_{i}$$. This holds even when the $$X_{i}$$ are not independent.
2. If they are independent, $$V(\sum_{i}a_{i}X_{i})=\sum_{i}a_{i}^{2}\sigma_{i}^{2}$$
3. For any $$X_{1},\dots,X_{n}$$, $$V(\sum_{i}a_{i}X_{i})=\sum_{i}\sum_{j}a_{i}a_{j}\mathrm{Cov}(X_{i},X_{j})$$. The $$i=j$$ terms contribute $$a_{i}^{2}V(X_{i})$$, so this reduces to rule 2 when all the covariances vanish.
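Rules 1 and 2 are easy to check by simulation. The sketch below uses two hypothetical independent variables of my own choosing, $$X_{1}\sim\text{Uniform}(0,2)$$ (mean 1, variance 1/3) and $$X_{2}\sim\text{Exponential}(1)$$ (mean 1, variance 1), with $$Y=3X_{1}+2X_{2}$$:

```python
import random
import statistics

# Hypothetical example (not from the post): X1 ~ Uniform(0, 2) has mean 1
# and variance 1/3; X2 ~ Exponential(rate 1) has mean 1 and variance 1.
# For Y = 3*X1 + 2*X2 the rules predict:
#   E(Y) = 3*1 + 2*1 = 5
#   V(Y) = 9*(1/3) + 4*1 = 7   (valid because X1, X2 are independent)

random.seed(0)
samples = [3 * random.uniform(0, 2) + 2 * random.expovariate(1.0)
           for _ in range(200_000)]

mean_y = statistics.fmean(samples)
var_y = statistics.variance(samples)
print(round(mean_y, 2), round(var_y, 2))  # both should be close to 5 and 7
```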

As a special case, let $$Y=X_{1}-X_{2}$$ with $$X_{1},X_{2}$$ independent. Then $$E(Y)=E(X_{1})-E(X_{2})$$ but $$V(Y)=V(X_{1})+V(X_{2})$$: the variances add even though the means subtract.
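A quick check of the difference case, using two hypothetical independent normals $$X_{1}\sim N(10,2^{2})$$ and $$X_{2}\sim N(4,3^{2})$$:

```python
import random
import statistics

# Hypothetical example: X1 ~ N(10, 2^2), X2 ~ N(4, 3^2), independent.
# For Y = X1 - X2 the rules predict E(Y) = 10 - 4 = 6 and
# V(Y) = 4 + 9 = 13 -- the variances ADD despite the minus sign.

random.seed(1)
diffs = [random.gauss(10, 2) - random.gauss(4, 3) for _ in range(200_000)]

mean_d = statistics.fmean(diffs)
var_d = statistics.variance(diffs)
print(round(mean_d, 2), round(var_d, 2))  # close to 6 and 13
```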

## The Case of Normal Random Variables

If $$X_{1},\dots,X_{n}$$ are independent normal random variables (they need not be identically distributed), then any linear combination of them is also normal.
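This means the distribution of a linear combination of independent normals is known exactly. A sketch, assuming (my choice) $$X_{1}\sim N(1,2^{2})$$, $$X_{2}\sim N(3,1^{2})$$, and $$Y=2X_{1}-X_{2}$$, comparing the closed-form normal CDF against a Monte Carlo estimate:

```python
import math
import random

# Hypothetical example: X1 ~ N(1, 2^2), X2 ~ N(3, 1^2), independent.
# Y = 2*X1 - X2 is normal with
#   mean = 2*1 - 3 = -1,   variance = 4*4 + 1*1 = 17.
mu, sigma = -1.0, math.sqrt(17.0)

def normal_cdf(y, mu, sigma):
    """CDF of N(mu, sigma^2), written in terms of the error function."""
    return 0.5 * (1.0 + math.erf((y - mu) / (sigma * math.sqrt(2.0))))

# Monte Carlo estimate of P(Y <= 2) for comparison.
random.seed(2)
hits = sum(2 * random.gauss(1, 2) - random.gauss(3, 1) <= 2.0
           for _ in range(200_000))

print(round(normal_cdf(2.0, mu, sigma), 3), round(hits / 200_000, 3))
```

The two printed probabilities should agree to a couple of decimal places, which is exactly what "any linear combination is normal" buys us: no simulation is needed once the mean and variance are known.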

What if the number of terms $$n$$ is not fixed? Let it be a random variable $$N$$, independent of the $$X_{i}$$, and let the $$X_{i}$$ be i.i.d. with common mean $$\mu$$. It can then be shown (Wald's identity) that $$E(\sum_{i=1}^{N}X_{i})=\mu E(N)$$.
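This random-sum identity can also be sanity-checked numerically. In the sketch below (all choices mine, not from the post), $$N$$ is uniform on $$\{1,\dots,5\}$$ so $$E(N)=3$$, and each $$X_{i}\sim\text{Exponential}(1/2)$$ so $$\mu=2$$, predicting $$E(\sum X_{i})=6$$:

```python
import random
import statistics

# Hypothetical example: N ~ Uniform{1,...,5}, so E(N) = 3, and each
# X_i ~ Exponential(rate 0.5), so mu = E(X_i) = 2, with N independent
# of the X_i.  Wald's identity predicts E(sum of N terms) = 2 * 3 = 6.

random.seed(4)

def random_sum():
    n = random.randint(1, 5)                       # the random count N
    return sum(random.expovariate(0.5) for _ in range(n))

totals = [random_sum() for _ in range(200_000)]
mean_total = statistics.fmean(totals)
print(round(mean_total, 2))  # close to 6
```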