Running the following code:
import torch
torch.manual_seed(47)
m = torch.rand([8, 8])
eigvals, eigvecs = torch.linalg.eig(m)
print(m.type(torch.complex64) @ eigvecs[0] - eigvals[0] * eigvecs[0])  # Should be close to 0.
results in the following output:
tensor([ 2.2827+0.0995j, -0.9539-0.0442j, -1.9871-0.0204j, -0.6442-0.0372j,
-0.8338-0.0518j, 1.2168-0.0077j, 0.4924-0.5008j, 0.9643+0.4292j])
What is going on here? Why does the eigenpair check fail?
The documentation of torch.linalg.eig states that the eigenvectors are given by the columns of the resulting tensor (eigvecs in your case); what you got with eigvecs[0] is the first row, not the first eigenvector.
Using eigvecs.T[0] or eigvecs[:, 0] will yield the expected (i.e. close to zero) result:
print(m.type(torch.complex64) @ eigvecs.T[0] - eigvals[0] * eigvecs.T[0])
# >>> tensor([ 3.5763e-07+0.j, 5.9605e-07+0.j, 4.7684e-07+0.j, 2.3842e-07+0.j,
# 3.5763e-07+0.j, 2.3842e-07+0.j, -2.3842e-07+0.j, 1.1921e-07+0.j])
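As an optional sanity check (a minimal sketch, not required for the fix), you can verify all eigenpairs at once using the identity A V = V diag(λ), which holds precisely when the eigenvectors are the columns of V:

import torch

torch.manual_seed(47)
m = torch.rand([8, 8])
eigvals, eigvecs = torch.linalg.eig(m)

# m @ V should equal V @ diag(eigvals) when the eigenvectors are the columns of V,
# so the residual below should only contain float32 round-off noise.
residual = m.type(torch.complex64) @ eigvecs - eigvecs @ torch.diag(eigvals)
print(residual.abs().max())  # expected to be roughly 1e-6 or smaller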