I am trying to learn Python, and I have run into something confusing. As a sanity check, I wanted to make sure I could reconstruct a graph Laplacian matrix from its eigenvectors and eigenvalues. In R this works as expected, but not in Python: the reconstructed matrix is pretty far off, with norm(Laplacian - estimate) ~ 0.99, while in R it is ~1e-16. I was hoping that someone could explain to me what is going on here. I've posted the code for both languages below:
In R:
library(igraph)
# ring lattice: 20 nodes, each connected to all neighbours within distance 3, no rewiring
g <- watts.strogatz.game(1, 20, 3, 0, loops = FALSE, multiple = FALSE)
A <- as.matrix(as_adjacency_matrix(g, type = c("both"),
               attr = NULL, edges = FALSE, names = TRUE,
               sparse = FALSE))
# build the unnormalized Laplacian in place: -A off the diagonal, degrees on it
A <- -A
diag(A) <- abs(rowSums(A))
# symmetric normalization: Ln = D^(-1/2) L D^(-1/2)
D <- diag(diag(A)^-0.5, nrow(A))
Ln <- D %*% A %*% D
# eigendecomposition and reconstruction
eL <- eigen(Ln)
rL <- eL$vectors %*% diag(eL$values) %*% t(eL$vectors)
print(norm(Ln - rL))
In Python:
import networkx as nx
import numpy as np

n = 20
G = nx.watts_strogatz_graph(n, 3, 0)
L = nx.normalized_laplacian_matrix(G).toarray()
# eigendecomposition, sorted by descending eigenvalue
evals, evecs = np.linalg.eig(L)
idx = evals.argsort()[::-1]
evals = evals[idx]
F = evecs[:, idx]
# attempt to reconstruct L as F diag(evals) F^T
D = np.diag(evals)
FDF = np.linalg.multi_dot([F, D, F.T])
rec = np.linalg.norm(L - FDF)
print(rec)
Thank you!
Paul
This does not really answer the question, but using np.linalg.svd instead of np.linalg.eig does seem to work fine.
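For reference, a minimal sketch of that workaround, assuming the same graph and variable names as in the question:

import networkx as nx
import numpy as np

G = nx.watts_strogatz_graph(20, 3, 0)
L = nx.normalized_laplacian_matrix(G).toarray()

# np.linalg.svd returns orthogonal U and Vt, and U diag(S) Vt reconstructs
# the input to machine precision; for a symmetric PSD matrix like L the
# singular values also coincide with the eigenvalues.
U, S, Vt = np.linalg.svd(L)
rec = np.linalg.multi_dot([U, np.diag(S), Vt])
print(np.linalg.norm(L - rec))  # ~1e-15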
Edit: actually, now I understand. numpy has linalg.eig, which does not assume that the matrix is symmetric (or Hermitian), and linalg.eigh, which does. Use the second, and all is well. Because eig treats L as a general matrix, the eigenvectors it returns are not guaranteed to be orthogonal (and here, where the unrewired lattice has repeated eigenvalues, they can be far from it), so F @ D @ F.T is not a valid reconstruction; with eig's output you would need F @ D @ np.linalg.inv(F). eigh returns an orthonormal set of eigenvectors, for which the transpose version is correct.
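A minimal sketch of the fix, keeping the question's setup and just swapping np.linalg.eig for np.linalg.eigh:

import networkx as nx
import numpy as np

n = 20
G = nx.watts_strogatz_graph(n, 3, 0)
L = nx.normalized_laplacian_matrix(G).toarray()

# eigh is intended for symmetric/Hermitian matrices and returns eigenvalues in
# ascending order together with an orthonormal set of eigenvectors, so the
# F diag(evals) F^T reconstruction now holds to machine precision.
evals, F = np.linalg.eigh(L)
FDF = np.linalg.multi_dot([F, np.diag(evals), F.T])
print(np.linalg.norm(L - FDF))  # ~1e-15

R's eigen() behaves like eigh here because, by default, it checks isSymmetric(x) and dispatches to the symmetric routine, which is why the R reconstruction already came out at ~1e-16.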