Transformation of Random Variables

The goal is to find the distribution of a transformed random variable, or the joint distribution of a transformed random vector.

Theorem 8.1.1 (Change of variables in one dimension). Let X be a continuous r.v. with PDF \(f_X\), and let Y = g(X), where g is differentiable and strictly increasing (or strictly decreasing). Then the PDF of Y is given by \[ f_Y(y) = f_X(x)\left| \dfrac{dx}{dy} \right|, \] where \(x = g^{-1}(y)\). Proof (for g strictly increasing; the decreasing case is similar, and the absolute value covers both): \[ \begin{align} F_Y(y) &= P(Y \le y)\\ &= P(g(X) \le y) \\ &= P(X \le g^{-1}(y)) && \text{(g strictly increasing)} \\ &= F_X(g^{-1}(y)) \\ &= F_X(x) && (x = g^{-1}(y)) \end{align} \]

Differentiating with respect to y and applying the chain rule gives the formula: \(f_Y(y) = \dfrac{dF_Y(y)}{dy} = \dfrac{dF_X(x)}{dy} = \dfrac{dF_X(x)}{dx}\dfrac{dx}{dy} = f_X(x)\dfrac{dx}{dy}\). Since g is strictly increasing, \(\dfrac{dx}{dy} > 0\), so this equals \(f_X(x)\left| \dfrac{dx}{dy} \right|\); in the strictly decreasing case the derivative is negative and the absolute value again gives the correct (nonnegative) PDF.
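The one-dimensional formula can be sanity-checked by Monte Carlo. Below is a minimal sketch (the choice of X ~ Unif(0, 1) and g(x) = x² is my own illustration, not from the text): the theorem gives \(f_Y(y) = f_X(\sqrt{y})\cdot\frac{1}{2\sqrt{y}} = \frac{1}{2\sqrt{y}}\) on (0, 1), hence \(F_Y(y) = \sqrt{y}\), which we compare against simulated frequencies.

```python
import random
import math

# Illustrative check of Theorem 8.1.1 (example chosen for this sketch):
# X ~ Unif(0,1), g(x) = x^2 is strictly increasing on (0,1).
# The change-of-variables formula predicts f_Y(y) = 1/(2*sqrt(y)),
# so the CDF is F_Y(y) = sqrt(y) for y in (0,1).

random.seed(0)
n = 200_000
ys = [random.random() ** 2 for _ in range(n)]  # samples of Y = X^2

for y in (0.04, 0.25, 0.81):
    empirical = sum(v <= y for v in ys) / n  # Monte Carlo estimate of P(Y <= y)
    analytic = math.sqrt(y)                  # F_Y(y) predicted by the theorem
    assert abs(empirical - analytic) < 0.01
```

The empirical CDF agrees with \(\sqrt{y}\) to within Monte Carlo error, as the theorem predicts.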

Theorem 8.1.5 (Change of variables). Let X = (X1, . . . , Xn) be a continuous random vector with joint PDF \(f_X\), and let Y = g(X), where g is an invertible, differentiable function from \(\mathbb{R}^n\) to \(\mathbb{R}^n\). (The domain of g does not actually have to be all of \(\mathbb{R}^n\), but it does need to be large enough to contain the support of X, else g(X) could be undefined!) Then \[ \begin{align} f_Y(y) = f_X(x)\left|\dfrac{\partial x}{\partial y}\right| \end{align} \] where \(\dfrac{\partial x}{\partial y}\) is the Jacobian matrix of the inverse transformation \(x = g^{-1}(y)\), and \(\left|\dfrac{\partial x}{\partial y}\right|\) denotes the absolute value of its determinant.
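Spelled out (this expansion is just standard notation, not from the text), the Jacobian factor in the theorem is

\[
\left|\dfrac{\partial x}{\partial y}\right|
= \left|\det \begin{pmatrix}
\dfrac{\partial x_1}{\partial y_1} & \cdots & \dfrac{\partial x_1}{\partial y_n} \\
\vdots & \ddots & \vdots \\
\dfrac{\partial x_n}{\partial y_1} & \cdots & \dfrac{\partial x_n}{\partial y_n}
\end{pmatrix}\right|,
\]

which for n = 1 reduces to the \(\left|\dfrac{dx}{dy}\right|\) of Theorem 8.1.1.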

The Box-Muller example: Let U ~ Unif(0, \(2\pi\)), and let T ~ Exp(1) be independent of U. Define X = \(\sqrt{2T}\cos U\) and Y = \(\sqrt{2T}\sin U\). Find the joint PDF of (X, Y). Are they independent? What are their marginal distributions? Inverting the transformation gives \(t = \frac{x^2 + y^2}{2}\), with u the angle of the point (x, y); since \(\frac{\partial t}{\partial x} = x\), \(\frac{\partial t}{\partial y} = y\), \(\frac{\partial u}{\partial x} = \frac{-y}{x^2+y^2}\), \(\frac{\partial u}{\partial y} = \frac{x}{x^2+y^2}\), the Jacobian determinant equals 1. \[ \begin{align} f_{X,Y}(x, y) &= f_{T,U}(t,u) \left|\det \begin{pmatrix} \frac{\partial t}{\partial x} & \frac{\partial t}{\partial y} \\ \frac{\partial u}{\partial x} & \frac{\partial u}{\partial y} \end{pmatrix} \right|\\ &= e^{-t} \cdot \frac{1}{2\pi} \cdot 1 \\ &= \frac{1}{2\pi} e^{-(x^2 + y^2)/2} \\ &= \frac{1}{\sqrt{2\pi}}e^{-x^2/2} \cdot \frac{1}{\sqrt{2\pi}}e^{-y^2/2} \end{align} \] The joint PDF factors into a function of x times a function of y, so X and Y are independent, and each is marginally standard Normal, \(\mathcal{N}(0, 1)\).
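The Box-Muller construction above doubles as a sampler for the standard Normal. A minimal sketch (using only the standard-library `random` module; the function name `box_muller` is my own): draw U and T as in the example, form (X, Y), and check that the samples have mean 0, variance 1, and are uncorrelated, as the derivation predicts.

```python
import random
import math

def box_muller(rng):
    """Draw one (X, Y) pair via the Box-Muller transform from the example."""
    u = rng.uniform(0.0, 2.0 * math.pi)  # U ~ Unif(0, 2*pi)
    t = rng.expovariate(1.0)             # T ~ Exp(1), independent of U
    r = math.sqrt(2.0 * t)
    return r * math.cos(u), r * math.sin(u)

rng = random.Random(0)
pairs = [box_muller(rng) for _ in range(200_000)]
xs = [p[0] for p in pairs]
ys = [p[1] for p in pairs]
n = len(pairs)

# Empirical moments should match N(0,1) marginals and zero correlation.
mean_x = sum(xs) / n
mean_y = sum(ys) / n
var_x = sum(v * v for v in xs) / n - mean_x ** 2
cov_xy = sum(x * y for x, y in pairs) / n - mean_x * mean_y

assert abs(mean_x) < 0.01        # E[X] = 0
assert abs(var_x - 1.0) < 0.02   # Var(X) = 1
assert abs(cov_xy) < 0.01        # X, Y uncorrelated
```

Note that independence is what the factored joint PDF proves; the code can only check the weaker (but necessary) condition that the sample covariance is near zero.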