## Probability

Given two discrete random variables \(X,Y\), the joint pmf is \(p(x,y)=P(X=x,\,Y=y)\).

Let \(A\) be an event (a set of \((x,y)\) pairs). Then \(P\big((X,Y)\in A\big)=\sum_{(x,y)\in A}p(x,y)\).

The **marginal** pmf of \(X\) is denoted by \(p_{X}(x)\) and is obtained by summing out \(y\): \(p_{X}(x)=\sum_{y}p(x,y)\), and similarly \(p_{Y}(y)=\sum_{x}p(x,y)\).

For continuous random variables, the joint density function \(f(x,y)\) satisfies \(P\big((X,Y)\in A\big)=\iint_{A}f(x,y)\,dx\,dy\).

The marginal pdf of \(X\) is \(f_{X}(x)=\int_{-\infty}^{\infty}f(x,y)\,dy\).
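In the discrete case, marginalizing is just summing the joint table over the other variable. A minimal sketch, using a small hypothetical joint pmf (the table and values are made up for illustration):

```python
# Hypothetical joint pmf over (x, y) pairs; entries sum to 1.
joint = {
    (0, 0): 0.10, (0, 1): 0.30,
    (1, 0): 0.20, (1, 1): 0.40,
}

def marginal(joint, axis):
    """Marginal pmf: sum the joint pmf over the other coordinate.

    axis=0 keeps x (gives p_X); axis=1 keeps y (gives p_Y).
    """
    out = {}
    for pair, p in joint.items():
        key = pair[axis]
        out[key] = out.get(key, 0.0) + p
    return out

p_X = marginal(joint, 0)  # p_X(0) = 0.10 + 0.30 = 0.40, p_X(1) = 0.60
p_Y = marginal(joint, 1)  # p_Y(0) = 0.10 + 0.20 = 0.30, p_Y(1) = 0.70
```

The continuous version replaces the sum with an integral, but the idea is the same: collapse the joint distribution along the variable you do not care about.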

## Independence

Two random variables \(X\) and \(Y\) are **independent** if
\(p(x,y)=p_{X}(x)\,p_{Y}(y)\) for all \((x,y)\)
(and analogously \(f(x,y)=f_{X}(x)\,f_{Y}(y)\) in the continuous case).

The author's claim, as I read it: for \(X\) and \(Y\) to be independent, \(f(x,y)\) must factor as \(g(x)\,k(y)\), and the region of positive density must be a rectangle with sides parallel to the axes.

Random variables \(X_{1},\dots,X_{n}\) are (mutually) independent if, for every subset of \(X_{1},\dots,X_{n}\), the joint pmf/pdf of that subset factors into the product of the corresponding marginals.
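For a finite discrete joint pmf, the pairwise definition can be checked directly by comparing \(p(x,y)\) against \(p_{X}(x)\,p_{Y}(y)\) at every point. A sketch with two hypothetical tables (values invented for illustration):

```python
import math

def is_independent(joint, tol=1e-9):
    """Check p(x, y) == p_X(x) * p_Y(y) at every support point."""
    p_x, p_y = {}, {}
    for (x, y), p in joint.items():
        p_x[x] = p_x.get(x, 0.0) + p
        p_y[y] = p_y.get(y, 0.0) + p
    return all(math.isclose(p, p_x[x] * p_y[y], abs_tol=tol)
               for (x, y), p in joint.items())

# Product-form table: p(x, y) = p_X(x) * p_Y(y) everywhere -> independent.
indep = {(0, 0): 0.12, (0, 1): 0.28, (1, 0): 0.18, (1, 1): 0.42}
# Same marginals, but the mass is shifted -> dependent.
dep = {(0, 0): 0.20, (0, 1): 0.20, (1, 0): 0.10, (1, 1): 0.50}

print(is_independent(indep))  # True
print(is_independent(dep))    # False
```

The second table shows why checking marginals alone is not enough: both tables have identical marginals, but only the first factors.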

## Conditional Probability

Let \(X,Y\) be two random variables. The conditional pmf of \(Y\) given \(X=x\) is \(p_{Y\mid X}(y\mid x)=\dfrac{p(x,y)}{p_{X}(x)}\), defined wherever \(p_{X}(x)>0\) (with densities \(f\) in place of \(p\) in the continuous case).
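This is just the joint row for \(X=x\), renormalized by \(p_{X}(x)\). A sketch using the same kind of hypothetical joint table as above:

```python
# Hypothetical joint pmf (values invented for illustration).
joint = {(0, 0): 0.10, (0, 1): 0.30, (1, 0): 0.20, (1, 1): 0.40}

def conditional_Y_given_X(joint, x):
    """Conditional pmf p_{Y|X}(. | x) = p(x, y) / p_X(x)."""
    p_x = sum(p for (xx, _), p in joint.items() if xx == x)
    if p_x == 0:
        raise ValueError("conditioning on a zero-probability value of X")
    return {y: p / p_x for (xx, y), p in joint.items() if xx == x}

cond = conditional_Y_given_X(joint, 0)
# p_X(0) = 0.40, so p(y=0 | x=0) = 0.10/0.40 = 0.25
# and p(y=1 | x=0) = 0.30/0.40 = 0.75.
```

Note the result is itself a valid pmf: the entries sum to 1, which is exactly what dividing by \(p_{X}(x)\) guarantees.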