Factor any real or complex matrix A into three matrices:

A = UΣV*

such that U and V are unitary (orthogonal, in the real case) and Σ is diagonal with non-negative real entries, the singular values.

The result is to represent the action of the matrix as a rotation (or reflection), a scaling, and another rotation (or reflection).
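A minimal sketch of the factorization and its properties, using NumPy (the specific matrix here is just an illustration):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# Full SVD: A = U @ diag(S) @ Vh, with U, Vh orthogonal and S >= 0.
U, S, Vh = np.linalg.svd(A)

# U is orthogonal: a rotation or reflection.
print(np.allclose(U @ U.T, np.eye(2)))      # True
# Singular values are non-negative, sorted in descending order.
print(S)                                    # [4. 2.]
# The three factors reconstruct A.
print(np.allclose(U @ np.diag(S) @ Vh, A))  # True
```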

PyTorch lets you run SVD on a higher-rank tensor by treating the leading dimensions as a batch: the SVD is applied to the last two dimensions, one matrix per batch entry. This is supported but not commonly needed.
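NumPy's `linalg.svd` has the same batched behavior, so it works as a stand-in sketch for the PyTorch call (the shapes below are the point, not the library):

```python
import numpy as np

# A "batch" of 4 random 5x3 matrices stored as one rank-3 tensor.
rng = np.random.default_rng(0)
batch = rng.standard_normal((4, 5, 3))

# svd operates on the last two dimensions, one SVD per batch entry.
U, S, Vh = np.linalg.svd(batch, full_matrices=False)
print(U.shape, S.shape, Vh.shape)  # (4, 5, 3) (4, 3) (4, 3, 3)

# Reconstruct every matrix in the batch from its factors.
recon = U @ (S[..., None] * Vh)
print(np.allclose(recon, batch))   # True
```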

SVD has a deep relationship to principal component analysis that I sort of understood in grad school and don’t think is worth (re-)learning now. But both can be used for dimensionality reduction.
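Even without revisiting the PCA connection, the dimensionality-reduction use is easy to demonstrate: keep only the top-k right singular vectors and project onto them. A sketch with synthetic data (the data here is constructed to be exactly rank 2, so the reduction is lossless):

```python
import numpy as np

rng = np.random.default_rng(1)
# 100 samples in 10 dimensions, but the data really lives in 2 dimensions.
data = rng.standard_normal((100, 2)) @ rng.standard_normal((2, 10))

U, S, Vh = np.linalg.svd(data, full_matrices=False)

# Only the first 2 singular values are numerically non-zero.
print(np.sum(S > 1e-10))  # 2

# Project onto the top-2 right singular vectors: the reduced representation.
reduced = data @ Vh[:2].T  # shape (100, 2)

# The rank-2 reconstruction recovers the data exactly in this case.
recon = reduced @ Vh[:2]
print(np.allclose(recon, data))  # True
```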