I want to find the eigenvalues of a square matrix, and I tried two different approaches.
import numpy as np
D = np.random.random((500,500))
eigenvalues, eigenvectors = np.linalg.eig(D)
%timeit np.linalg.eigvals(D)
129 ms ± 7 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)
%timeit eigenvectors@D@np.linalg.inv(eigenvectors)
42.2 ms ± 4.54 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)
Why is the second approach faster? Wasn't the eigvals algorithm supposed to be faster because it is well optimized? What am I missing? (Assume, hypothetically, that we already know the eigenvectors and do not need to compute them.)
But those two snippets do totally different things.

The first one (the first you timeit) just computes the eigenvalues, without using the fact that you already computed them one line earlier.
The second one does not really compute eigenvalues; it computes the diagonal matrix (and even fails to do that, because you inverted the order of the computation: np.linalg.inv(eigenvectors) @ D @ eigenvectors is the diagonal matrix, whose diagonal is indeed made of the eigenvalues). You probably saw this code in an example where D is the diagonal matrix (as its name suggests), in which case your expression computes the original matrix back. And that computation assumes the eigenvectors are already known.
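A minimal sketch of the order issue (variable names are illustrative): with the eigenvectors P from eig, it is P⁻¹ @ A @ P that is diagonal, not P @ A @ P⁻¹.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((5, 5))  # small real matrix; eigenvalues may be complex

eigenvalues, P = np.linalg.eig(A)  # columns of P are the eigenvectors

# Correct order: P^-1 @ A @ P is the diagonal matrix of eigenvalues,
# in the same order eig returned them.
Lam = np.linalg.inv(P) @ A @ P
print(np.allclose(np.diag(Lam), eigenvalues))  # → True

# The reversed order, P @ A @ P^-1, is generally NOT diagonal.
wrong = P @ A @ np.linalg.inv(P)
print(np.allclose(wrong, np.diag(np.diag(wrong))))  # → False
```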
So, even if the second one were a way to compute eigenvalues (a convoluted one, I think: computing a whole diagonal matrix just to get its diagonal), it would be a way that requires computing the eigenvectors first. And to do that, you almost certainly have to compute the eigenvalues too; at least, I know of no efficient algorithm that computes eigenvectors without also computing eigenvalues. That is why NumPy has an eigvals method, which computes only the eigenvalues, and an eig method, which computes both eigenvalues and eigenvectors, but no eigvec method that would compute only eigenvectors: that would be pointless. Why have a method that does less yet costs as much as eig?
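You can check that eigvals and eig agree on the eigenvalues themselves (sorting guards against any difference in output order):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.random((100, 100))

vals_only = np.linalg.eigvals(A)    # eigenvalues only
vals_both, vecs = np.linalg.eig(A)  # eigenvalues and eigenvectors

# np.sort on a complex array sorts by real part, then imaginary part,
# so the comparison does not depend on the order LAPACK emits them in.
print(np.allclose(np.sort(vals_only), np.sort(vals_both)))  # → True
```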
So, another way to put it: if you want to compare the timings, you should add to the second timing what it costs to compute eigenvectors. On my PC (apparently about twice as slow as yours), I need 277 ms to compute eigvals, and 120 ms to compute the diagonal matrix from eigenvectors, to which you must add the 360 ms I needed to compute eigenvectors in the first place.
So in total, it is 277 ms for eigvals vs. 480 ms for eig + P@D@P⁻¹.
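A fair comparison, sketched with the standard-library timeit module (absolute numbers are machine-dependent), charges the eig call to the second approach:

```python
import timeit
import numpy as np

D = np.random.random((500, 500))
eigenvalues, eigenvectors = np.linalg.eig(D)

n = 5  # repetitions per measurement
t_eigvals = timeit.timeit(lambda: np.linalg.eigvals(D), number=n) / n
t_eig = timeit.timeit(lambda: np.linalg.eig(D), number=n) / n
t_diag = timeit.timeit(
    lambda: np.linalg.inv(eigenvectors) @ D @ eigenvectors, number=n
) / n

# eigvals alone vs. the honest cost of the "diagonal matrix" route
print(f"eigvals:              {t_eigvals * 1e3:.1f} ms")
print(f"eig + P^-1 @ D @ P:   {(t_eig + t_diag) * 1e3:.1f} ms")
```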