\(\newcommand{\Cov}{\mathrm{Cov}}\) \(\newcommand{\Corr}{\mathrm{Corr}}\)

Let \(X_{1},\dots,X_{n}\) be random variables with means \(\mu_{i}\) and variances \(\sigma_{i}^{2}\). Then:

- \(E(\sum_{i}a_{i}X_{i})=\sum_{i}a_{i}\mu_{i}\). This holds whether or not the \(X_{i}\) are independent.
- If the \(X_{i}\) are independent, \(V(\sum_{i}a_{i}X_{i})=\sum_{i}a_{i}^{2}\sigma_{i}^{2}\).
- For any \(X_{1},\dots,X_{n}\), \(V(\sum_{i}a_{i}X_{i})=\sum_{i}\sum_{j}a_{i}a_{j}\Cov(X_{i},X_{j})\). This reduces to the previous formula when the \(X_{i}\) are independent, since then \(\Cov(X_{i},X_{j})=0\) for \(i\neq j\) and \(\Cov(X_{i},X_{i})=\sigma_{i}^{2}\).
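
The general covariance formula can be checked numerically. The sketch below (variable names and distributions are illustrative) builds three correlated samples and compares the sample variance of a linear combination against \(a^{T}\Sigma a\), the double sum over the sample covariance matrix:

```python
import numpy as np

# Numerical check of V(sum_i a_i X_i) = sum_i sum_j a_i a_j Cov(X_i, X_j).
# The variables and coefficients here are illustrative, not from the text.
rng = np.random.default_rng(0)
n = 200_000
x1 = rng.normal(0, 1, n)
x2 = 0.5 * x1 + rng.normal(0, 1, n)   # correlated with x1: Cov(x1, x2) = 0.5
x3 = rng.normal(2, 3, n)              # independent of the others
X = np.vstack([x1, x2, x3])           # shape (3, n)
a = np.array([2.0, -1.0, 0.5])

lhs = np.var(a @ X, ddof=1)           # sample variance of the combination
cov = np.cov(X)                       # 3x3 sample covariance matrix
rhs = a @ cov @ a                     # sum_i sum_j a_i a_j Cov(X_i, X_j)
print(lhs, rhs)                       # the two agree; theoretical value is 5.5
```

Here the theoretical variance is \(4\cdot1 + 1\cdot1.25 + 0.25\cdot9 + 2(2)(-1)(0.5) = 5.5\), and both computed values land close to it.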

As a special case, let \(Y=X_{1}-X_{2}\) with \(X_{1},X_{2}\) independent. Then \(E(Y)=E(X_{1})-E(X_{2})\) and \(V(Y)=V(X_{1})+V(X_{2})\). The variances *add* even though the variables are subtracted, because the coefficient \(a_{2}=-1\) enters the variance squared.
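
A quick simulation of the difference case, with two illustrative independent distributions:

```python
import numpy as np

# For independent X1, X2: E(X1 - X2) = E(X1) - E(X2),
# but V(X1 - X2) = V(X1) + V(X2) (the variances add).
rng = np.random.default_rng(1)
n = 500_000
x1 = rng.normal(5, 2, n)    # mean 5, variance 4
x2 = rng.exponential(3, n)  # mean 3, variance 9
y = x1 - x2
print(y.mean())  # close to 5 - 3 = 2
print(y.var())   # close to 4 + 9 = 13
```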

## The Case of Normal Random Variables

If \(X_{1},\dots,X_{n}\) are independent normal random variables (they need not be identically distributed), then any linear combination \(\sum_{i}a_{i}X_{i}\) is also normal, with mean and variance given by the formulas above.
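
A sketch of this fact (parameters chosen for illustration): the combination below should be \(N(3\cdot1 + 2\cdot(-3),\ 9\cdot4 + 4\cdot1) = N(-3, 40)\), so its empirical upper 2.5% quantile should sit near \(\mu + 1.96\sigma\):

```python
import numpy as np

# A linear combination of independent normals is itself normal.
# Check sample mean/variance and one quantile against the exact N(-3, 40).
rng = np.random.default_rng(2)
n = 400_000
x1 = rng.normal(1, 2, n)   # N(1, 4)
x2 = rng.normal(-3, 1, n)  # N(-3, 1)
y = 3 * x1 + 2 * x2        # exactly N(-3, 40)
print(y.mean(), y.var())
# empirical 97.5% quantile vs. the exact normal quantile mu + 1.96*sigma:
print(np.quantile(y, 0.975), -3 + 1.96 * np.sqrt(40))
```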

What if the *number* of terms \(n\) is not fixed? Replace it with a random
variable \(N\), independent of the \(X_{i}\), where the \(X_{i}\) are i.i.d.
with common mean \(\mu\). It can then be shown (Wald's identity) that
\(E\left(\sum_{i=1}^{N}X_{i}\right)=\mu\,E(N)\).
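
A simulation sketch of this identity, using an illustrative Poisson count for \(N\) and normal summands:

```python
import numpy as np

# Wald's identity: E(sum_{i=1}^N X_i) = mu * E(N), with the X_i i.i.d.
# with mean mu and N a random count independent of the X_i.
# N ~ Poisson(4) and X_i ~ N(2.5, 1) are illustrative choices.
rng = np.random.default_rng(3)
mu = 2.5
trials = 100_000
N = rng.poisson(4, trials)              # random number of terms, E(N) = 4
kmax = N.max()
X = rng.normal(mu, 1, (trials, kmax))   # one row of summands per trial
mask = np.arange(kmax) < N[:, None]     # keep only the first N terms per row
totals = (X * mask).sum(axis=1)         # sum_{i=1}^N X_i, one value per trial
print(totals.mean())                    # close to mu * E(N) = 2.5 * 4 = 10
```

The masking trick just vectorizes "sum the first \(N\) draws" across all trials; a plain Python loop over trials would compute the same thing.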