SVD and Eigendecomposition, XGBoost, etc.

What I did

What I learned

Eigen and Singular Value Decomposition

Eigendecomposition is a way of representing a diagonalizable matrix in terms of its eigenvalues and eigenvectors.
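In symbols: if \(A\) has a full set of linearly independent eigenvectors, collected as the columns of \(Q\), with the corresponding eigenvalues on the diagonal of \(\Lambda\), then

\[A = Q \Lambda Q^{-1}\]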

Singular value decomposition is essentially a generalized form of the eigenvector/eigenvalue relation. Here, an \(m \times n\) matrix \(A\) is related to two orthogonal matrices \(U\) and \(V\) and a diagonal matrix \(\Sigma\) such that

\[\underset{m \times n}{A} = \underset{m\times m}{U} ~~ \underset{m \times n}{\Sigma} ~~ \underset{n\times n}{V^T}\]

SVD lets you map from a higher-dimensional vector space into a lower one (e.g. \(\mathbb{R}^4 \rightarrow \mathbb{R}^3\)): the matrix \(A\) sends each right singular vector \(v_i\) (a column of \(V\)) to the corresponding left singular vector \(u_i\), scaled by the singular value \(\sigma_i\), so \(A v_i = \sigma_i u_i\). Some vectors in \(V\) might go away (i.e. get scaled to 0); those lie in the null space of \(A\).
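Here is a minimal numpy sketch checking that relation, \(A v_i = \sigma_i u_i\) (the matrix and variable names are illustrative, not from the original notes):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))      # maps R^4 -> R^3

U, s, Vt = np.linalg.svd(A)          # U: 3x3, s: singular values, Vt: 4x4

# Each right singular vector v_i is sent to sigma_i * u_i.
for i, sigma in enumerate(s):
    v_i = Vt[i]                      # i-th row of V^T = i-th column of V
    assert np.allclose(A @ v_i, sigma * U[:, i])

# A 3x4 matrix has at most 3 nonzero singular values, so the leftover
# direction in V is "scaled to 0" -- it lies in the null space of A.
assert np.allclose(A @ Vt[3], 0)
```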

1 Recall that to invert a \(2 \times 2\) matrix (i.e. make \(U \rightarrow U^{-1}\)) you swap the diagonal entries, negate the off-diagonal entries, and divide by the (original) determinant. (For an orthogonal matrix like \(U\), the inverse is just the transpose: \(U^{-1} = U^T\).)
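Written out for the \(2 \times 2\) case:

\[\begin{pmatrix} a & b \\ c & d \end{pmatrix}^{-1} = \frac{1}{ad - bc} \begin{pmatrix} d & -b \\ -c & a \end{pmatrix}\]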

What is Gradient Boosting, Really?

2 Note: you don’t want a neural network to do this. The whole point of adding dropout layers is to prevent exactly this sort of specialization, in order to make the NN more robust.
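For reference, here is a minimal sketch of the residual-fitting loop at the heart of gradient boosting, assuming squared-error loss and using sklearn's DecisionTreeRegressor as the weak learner (the function names and parameters here are illustrative, not from any particular library):

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gradient_boost(X, y, n_rounds=100, learning_rate=0.1, max_depth=3):
    """Fit an ensemble where each new tree targets the current residuals."""
    base = y.mean()                    # constant start: minimizes squared error
    pred = np.full(len(y), base)
    trees = []
    for _ in range(n_rounds):
        residuals = y - pred           # negative gradient of 0.5 * (y - pred)^2
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)         # new learner specializes on current errors
        pred += learning_rate * tree.predict(X)
        trees.append(tree)
    return base, trees

def boosted_predict(base, trees, X, learning_rate=0.1):
    return base + learning_rate * sum(tree.predict(X) for tree in trees)
```

XGBoost builds on this same loop, adding a regularized objective, second-order gradient information, and fast tree construction.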

What I will do next