Mathematics for Machine Learning
CVPR 2021 Tutorial: Normalizing Flows and Invertible Neural Networks in Computer Vision
Change-of-variables trick: Build complex distributions from simple distributions via a flow of successive (invertible) transformations
Key Idea: Transform random variable $X$ into random variable $Z$ using an invertible transformation $\phi$, while keeping track of the change in distribution
$$ \left|\det\frac{d\mathbf z}{d\mathbf x}\right|=\left|\det\frac{d\phi(\mathbf x)}{d\mathbf x}\right| $$
The determinant of the Jacobian tells us how much the volume element $d\mathbf x$ is stretched when mapped to $d\mathbf z$
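To make the stretching concrete, here is a minimal NumPy sketch. The map $\phi$ below is an illustrative assumption (not from the tutorial): the absolute Jacobian determinant at a point $\mathbf x$ is the local factor by which a small box around $\mathbf x$ grows or shrinks under $\phi$.

```python
import numpy as np

# Hypothetical invertible map phi: R^2 -> R^2 (illustrative choice):
#   z1 = exp(x1),  z2 = x2 + x1**2
def phi(x):
    x1, x2 = x
    return np.array([np.exp(x1), x2 + x1**2])

def jacobian(x):
    # Analytic Jacobian d(phi)/dx evaluated at x
    x1, _ = x
    return np.array([[np.exp(x1), 0.0],
                     [2.0 * x1,   1.0]])

x = np.array([0.5, -1.0])
# |det J| is the local volume-stretch factor: a small box dx around x
# is mapped to a region dz of volume |det J| * vol(dx).
stretch = abs(np.linalg.det(jacobian(x)))
print(stretch)  # exp(0.5) ≈ 1.6487
```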
Rescale $p_X$ by the inverse of the absolute Jacobian determinant to obtain $p_Z$
$$ p_Z(\mathbf z)=p_X(\mathbf x)\left|\det\frac{d\phi(\mathbf x)}{d\mathbf x}\right|^{-1} $$
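A minimal numerical check of this formula, under assumed choices not taken from the tutorial (base density $p_X$ = standard normal, invertible map $\phi(x)=\exp(x)$): the density of $Z$ computed via the change of variables matches the closed-form log-normal density.

```python
import numpy as np
from scipy.stats import norm, lognorm

# Assumed setup (illustrative): X ~ N(0, 1), z = phi(x) = exp(x),
# so Z is log-normally distributed and we can compare against scipy.
def p_Z(z):
    x = np.log(z)            # phi^{-1}(z)
    dphi_dx = np.exp(x)      # Jacobian of phi at x (a scalar in 1D)
    # Change of variables: p_Z(z) = p_X(x) * |d phi / d x|^{-1}
    return norm.pdf(x) / abs(dphi_dx)

z = np.array([0.5, 1.0, 2.0])
print(p_Z(z))
print(lognorm.pdf(z, s=1.0))  # closed-form reference: same values
```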