math, linear-algebra, svd

Fast computation of Singular Value Decomposition (SVD) for a matrix that undergoes continuous changes while maintaining the same dimensions


Is there a way to efficiently update the Singular Value Decomposition (SVD) of a time-varying matrix A(t) to obtain the SVD of A(t + Δt), given that we already know the SVD of A(t)? Assume the change in the matrix between the two time steps is small.

Instead of recomputing the SVD entirely at each time step, I'm looking for an approach that efficiently updates the existing SVD based on the small changes in the matrix.


Solution

  • No. You can think of the SVD as finding orthogonal directions aligned with the variance in the data. You might expect that a small delta in A would only slightly change the variance, and therefore only slightly change the decomposition. But some small changes can yield a completely different set of vectors. They would still span a very similar subspace, but the individual vectors can be quite different, so that the "5th singular vector" of A(t + Δt) is very different from the one at A(t). This happens in particular when neighboring singular values are close together, because the corresponding vectors can then reorder or mix under tiny perturbations (see the small numerical demonstration after this answer).

    It seems like you might be trying to save on compute in a real-time control process. Have you considered holding the singular vectors constant, or updating them only every 100th time step, and simply projecting the new data A(t+1) onto the previous basis in between? A rough sketch of that idea is shown below. Not sure if that's relevant for your application. Good luck!
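
As a small illustration of the instability argument above, here is a hedged numpy sketch (the matrix, the near-degenerate singular values, and the perturbation size are all made up for the example). It perturbs a matrix whose two smallest singular values are nearly equal and compares the corresponding singular vectors before and after: the individual vectors can move a lot even though the subspace they span barely changes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a matrix with two nearly-degenerate singular values (3.0 and 3.0 + 1e-6).
m, n = 8, 5
U, _ = np.linalg.qr(rng.standard_normal((m, m)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = np.array([10.0, 7.0, 5.0, 3.0 + 1e-6, 3.0])
A = U[:, :n] @ np.diag(s) @ V.T

# A tiny perturbation, much smaller than ||A|| but larger than the singular-value gap.
dA = 1e-4 * rng.standard_normal((m, n))

U1, s1, Vt1 = np.linalg.svd(A, full_matrices=False)
U2, s2, Vt2 = np.linalg.svd(A + dA, full_matrices=False)

# The individual trailing singular vectors can differ substantially...
angle = np.degrees(np.arccos(np.clip(abs(U1[:, 3] @ U2[:, 3]), -1.0, 1.0)))
print("angle between the 4th left singular vectors (deg):", angle)

# ...even though the 2-dimensional subspace they belong to barely moves.
P1 = U1[:, 3:5] @ U1[:, 3:5].T   # projector onto the old near-degenerate subspace
P2 = U2[:, 3:5] @ U2[:, 3:5].T   # projector onto the new near-degenerate subspace
print("subspace distance (spectral norm of projector difference):", np.linalg.norm(P1 - P2, 2))
```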
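
And here is a minimal sketch of the workaround suggested in the answer, again with numpy; the rank k and the refresh interval of 100 steps are arbitrary placeholders, not values from the question. Between full recomputations, the left and right singular vectors are frozen and only the small k-by-k projected core is re-decomposed, which is cheap when k is much smaller than the matrix dimensions.

```python
import numpy as np

def tracked_svd(matrix_stream, k=10, refresh_every=100):
    """Cheap rank-k SVD tracking: do a full SVD only every `refresh_every`
    steps; in between, reuse the frozen bases and update only the k-by-k core."""
    U = s = Vt = None
    for step, A in enumerate(matrix_stream):
        if U is None or step % refresh_every == 0:
            # Periodic full recomputation (the expensive step).
            U, s, Vt = np.linalg.svd(A, full_matrices=False)
            U, s, Vt = U[:, :k], s[:k], Vt[:k, :]
        else:
            # Project the new matrix onto the frozen bases, decompose only
            # the small k-by-k core, and rotate the old bases accordingly.
            core = U.T @ A @ Vt.T                  # k x k
            Uc, s, Vct = np.linalg.svd(core)
            U, Vt = U @ Uc, Vct @ Vt
        yield U, s, Vt
```

The quality of the in-between approximation degrades as A(t) drifts away from the basis captured at the last refresh, which is why the suggestion is to recompute the full SVD periodically rather than never.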