python, julia, sparse-matrix, gaussian-process

Complexity of Sparse Matrix Cholesky decomposition


I am having trouble finding a straightforward answer to the following question:

If you compute the Cholesky decomposition of an n×n symmetric positive definite matrix A, i.e. factor A = LL^T with L a lower triangular matrix, the complexity is O(n^3). For sparse matrices there are apparently faster algorithms, but how much faster?

What complexity can we achieve for such a matrix with, say, m < n^2 nonzero entries?

Edit: my matrix is also banded: only the main diagonal and some adjacent diagonals above and below it are nonzero.
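For the banded case specifically, SciPy has a dedicated banded Cholesky routine whose cost scales as O(n·b^2) for bandwidth b rather than O(n^3). A minimal sketch, assuming a tridiagonal (bandwidth 1), diagonally dominant SPD matrix:

```python
import numpy as np
from scipy.linalg import cholesky_banded, cho_solve_banded

# Tridiagonal SPD matrix in upper banded storage:
# row 0 holds the superdiagonal (first entry unused), row 1 the main diagonal.
n = 6
ab = np.zeros((2, n))
ab[0, 1:] = -1.0   # superdiagonal
ab[1, :] = 4.0     # main diagonal (diagonally dominant, hence SPD)

# Banded Cholesky: O(n * b^2) work instead of O(n^3)
cb = cholesky_banded(ab)                       # upper factor, banded storage
x = cho_solve_banded((cb, False), np.ones(n))  # solve A x = 1 using the factor
```

The factor itself stays banded, so both storage and work are linear in n for fixed bandwidth.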

P.S. I am eventually interested in implementations in either Julia or Python. Python has the sksparse.cholmod module (https://scikit-sparse.readthedocs.io/en/latest/cholmod.html), but it isn't clear to me which algorithm it uses or what its complexity is. I'm not sure about Julia; please tell me if you know.


Solution

  • This can only be answered exactly for arbitrary matrices if P = NP ... so it's not possible to answer in general. The time complexity depends on the fill-reducing ordering used, which attempts to approximate the solution of an NP-hard problem.

    However, for the very special case of a matrix coming from a regular square 2D or 3D mesh, there is an answer. In that case, nested dissection gives an asymptotically optimal ordering. For a 2D s-by-s mesh, the matrix has dimension n = s^2 and, I think, about 5n nonzero entries. L then has 31(n log2(n))/8 + O(n) nonzeros, and the work is 829(n^(3/2))/84 + O(n log n). For a 3D s-by-s-by-s mesh with n = s^3, there are O(n^(4/3)) nonzeros in L and O(n^2) operations are required to compute L.
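    To see the effect of the ordering in practice, one can build the 5-point Laplacian of an s-by-s mesh and compare the fill-in of the factor under the natural ordering versus a fill-reducing ordering. A sketch using SciPy's sparse LU (SciPy itself ships no sparse Cholesky; sksparse.cholmod applies fill-reducing orderings in the same spirit):

    ```python
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import splu

    # 5-point Laplacian on a regular s-by-s 2D mesh: n = s^2, about 5n nonzeros
    s = 20
    n = s * s
    T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(s, s))
    A = (sp.kron(sp.eye(s), T) + sp.kron(T, sp.eye(s))).tocsc()

    # Factor with the natural ordering vs. a fill-reducing (COLAMD) ordering;
    # the nonzero count of L depends strongly on the ordering chosen.
    lu_nat = splu(A, permc_spec="NATURAL")
    lu_amd = splu(A, permc_spec="COLAMD")
    print(A.nnz, lu_nat.L.nnz, lu_amd.L.nnz)
    ```

    On meshes like this, the fill-reducing ordering produces a noticeably sparser factor than the natural one, and the gap widens as s grows.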