I've been going through a tutorial on factorization in Julia. For practice, I am trying to take the eigendecomposition of a matrix and recreate the original matrix, using the formula:
A = VλV⁻¹
Where V is a matrix of eigenvectors, λ is a diagonal matrix of eigenvalues, and V⁻¹ is the inverse of V.
What confuses me is that the eigenvalues are returned as a vector, while the guides I found state they should be returned as a diagonal matrix.
Code example:
using LinearAlgebra
# Create a random matrix
A = rand(3, 3)
# Eigendecomposition
AEig = eigen(A)
λ = AEig.values   # 3-element Vector{Float64}
V = AEig.vectors  # 3×3 Matrix{Float64}
Acomp = V*λ*inv(V)
A ≈ Acomp
Trying to multiply the vector and matrices returns an error:
DimensionMismatch("A has dimensions (3,1) but B has dimensions (3,3)")
This occurs because multiplying V with λ returns a 3-element vector, which is then multiplied with V⁻¹, a 3×3 matrix. My question is: is there a straightforward way to create a diagonal matrix from a vector? Alternatively, can "recomposition" of the original matrix be achieved another way?
You can use the identity matrix from LinearAlgebra, written as I, like this:
julia> λ
3-element Vector{Float64}:
-0.4445656542213612
0.5573883013610712
1.310095519651262
julia> λ .* I(3)
3×3 Matrix{Float64}:
 -0.444566  -0.0       -0.0
  0.0        0.557388   0.0
  0.0        0.0        1.3101
The .* there means that every element of the vector is multiplied by the corresponding row of the matrix, which leaves a diagonal matrix of the eigenvalues.
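Plugging this into the reconstruction from the question, the formula A = VλV⁻¹ now multiplies conforming matrices (a sketch, assuming the same A, λ, and V as in the question):

```julia
using LinearAlgebra

# Same setup as in the question
A = rand(3, 3)
AEig = eigen(A)
λ = AEig.values
V = AEig.vectors

# λ .* I(3) broadcasts the eigenvalue vector into a diagonal matrix,
# so every factor in V * λ * V⁻¹ is now 3×3
Acomp = V * (λ .* I(3)) * inv(V)
A ≈ Acomp   # true (up to floating-point tolerance)
```

Note that for a generic real matrix the eigenvalues may be complex, so Acomp can be a complex matrix with negligible imaginary parts; ≈ still compares it against A within tolerance.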
[Edit:] After posting the question, I discovered another way to create a diagonal matrix, using the Diagonal() function. While the above solution works, this gives somewhat simpler syntax:
julia> Diagonal(λ)
3×3 Diagonal{Float64, Vector{Float64}}:
 -0.444566   ⋅         ⋅
   ⋅        0.557388   ⋅
   ⋅         ⋅        1.3101
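With Diagonal(), the full round trip from the question can be sketched in a few lines (assuming the same setup as above):

```julia
using LinearAlgebra

# Decompose a random matrix and rebuild it via A = V * Diagonal(λ) * V⁻¹
A = rand(3, 3)
AEig = eigen(A)
λ = AEig.values
V = AEig.vectors

# Diagonal(λ) wraps the eigenvalue vector as a lazy diagonal matrix,
# which is cheaper than materializing a dense 3×3 matrix
Acomp = V * Diagonal(λ) * inv(V)
A ≈ Acomp   # true (up to floating-point tolerance)
```

Diagonal also stores only the diagonal entries, so for large matrices it avoids allocating the off-diagonal zeros that λ .* I(n) would produce.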