Tags: python, data-science, pca, svd

Python: Implement a PCA using SVD


I am trying to figure out the differences between PCA using singular value decomposition (SVD) as opposed to PCA using eigendecomposition.

Picture the following matrix:

    B = np.array([[1, 2],
                  [3, 4],
                  [5, 6]])

When computing the PCA of this matrix B using eigendecomposition, we follow these steps (a code sketch follows the list):

  1. Center the data (entries of B) by subtracting the column mean from each column
  2. Compute the covariance matrix C = Cov(B) = B^T * B / (m - 1), where m = number of rows of B
  3. Find eigenvectors of C
  4. PCs = B * eigen_vecs, where B is the centered matrix from step 1
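
Here is a minimal sketch of these four steps in plain NumPy (eigenvectors are only defined up to sign, so the columns of the result may come out flipped):

    import numpy as np

    B = np.array([[1.0, 2], [3, 4], [5, 6]])

    # Step 1: center each column
    Bc = B - B.mean(axis=0)

    # Step 2: covariance matrix with m - 1 in the denominator
    m = Bc.shape[0]
    C = Bc.T @ Bc / (m - 1)

    # Step 3: eigh returns eigenvalues in ascending order, so reorder
    # the eigenvectors by descending eigenvalue (variance)
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]
    eigvecs = eigvecs[:, order]

    # Step 4: project the centered data onto the eigenvectors
    PCs = Bc @ eigvecs
    print(PCs)

This reproduces the first result shown below.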

When computing the PCA of matrix B using SVD, we follow these steps (again, a sketch follows the list):

  1. Compute SVD of B: B = U * Sigma * V.T
  2. PCs = U * Sigma
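
A minimal sketch of these two steps, applied to B exactly as written:

    import numpy as np

    B = np.array([[1.0, 2], [3, 4], [5, 6]])

    # Step 1: thin SVD of B (full_matrices=False gives the economy form)
    U, S, Vt = np.linalg.svd(B, full_matrices=False)

    # Step 2: PCs = U * Sigma; scaling the columns of U by the singular
    # values is the same as U @ np.diag(S)
    PCs = U * S
    print(PCs)

This reproduces the second result shown below.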

I have done both for the given matrix.

With eigendecomposition I obtain this result:

[[-2.82842712  0.        ]
 [ 0.          0.        ]
 [ 2.82842712  0.        ]]

With SVD I obtain this result:

[[-2.18941839  0.45436451]
 [-4.99846626  0.12383458]
 [-7.80751414 -0.20669536]]

The result obtained with eigendecomposition is the one given as the solution. So why is the result obtained with SVD different?

I know that C = Cov(B) = V * (Sigma^2 / (m - 1)) * V.T, and I have a feeling this might be related to why the two results are different. Still, can anyone help me understand better?
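
For what it's worth, that identity is easy to check numerically; a small sketch, assuming both sides are built from the same centered matrix:

    import numpy as np

    B = np.array([[1.0, 2], [3, 4], [5, 6]])
    Bc = B - B.mean(axis=0)              # centered data
    m = Bc.shape[0]

    U, S, Vt = np.linalg.svd(Bc, full_matrices=False)

    C = Bc.T @ Bc / (m - 1)                        # covariance from the data
    C_svd = Vt.T @ np.diag(S**2 / (m - 1)) @ Vt    # covariance from the SVD
    print(np.allclose(C, C_svd))                   # True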


Solution

  • Please see below a comparison for your matrix with sklearn.decomposition.PCA and numpy.linalg.svd. Can you compare this with how you derived your SVD result, or post your code?

    Code for sklearn.decomposition.PCA:

    from sklearn.decomposition import PCA
    import numpy as np 
    np.set_printoptions(precision=3)
    
    B = np.array([[1.0,2], [3,4], [5,6]])
    
    B1 = B.copy() 
    B1 -= np.mean(B1, axis=0) 
    n_samples = B1.shape[0]
    print("B1 is B after centering:")
    print(B1)
    
    cov_mat = np.cov(B1.T)   # covariance matrix (shown for reference, not used below)
    pca = PCA(n_components=2)
    X = pca.fit_transform(B1)
    print("X")
    print(X)
    
    # pca.components_ already holds the eigenvectors as rows
    print("Eigenvectors:")
    for eigenvector in pca.components_:
        print(eigenvector)
    eigenvecmat = pca.components_
    print("eigenvector-matrix")
    print(eigenvecmat)
    
    print("CHECK FOR PCA:")
    print("X * eigenvector-matrix (=B1)")
    print(np.dot(X, eigenvecmat))
    

    Output for PCA:

    B1 is B after centering:
    [[-2. -2.]
     [ 0.  0.]
     [ 2.  2.]]
    X
    [[-2.828  0.   ]
     [ 0.     0.   ]
     [ 2.828  0.   ]]
    Eigenvectors:
    [0.707 0.707]
    [-0.707  0.707]
    eigenvector-matrix
    [[ 0.707  0.707]
     [-0.707  0.707]]
    CHECK FOR PCA:
    X * eigenvector-matrix (=B1)
    [[-2. -2.]
     [ 0.  0.]
     [ 2.  2.]]
    

    numpy.linalg.svd:

    print("B1 is B after centering:")
    print(B1)
    
    from numpy.linalg import svd 
    U, S, Vt = svd(B1, full_matrices=True)
    
    print("U:")
    print(U)
    print("S used for building Sigma:")
    print(S)
    # svd returns the singular values as a 1-D array; embed them in a
    # 3x2 matrix so that U @ Sigma @ Vt has the right shapes
    Sigma = np.zeros((3, 2), dtype=float)
    Sigma[:2, :2] = np.diag(S)
    print("Sigma:")
    print(Sigma)
    print("V already transposed:")
    print(Vt)
    print("CHECK FOR SVD:")
    print("U * Sigma * Vt (=B1)")
    print(np.dot(U, np.dot(Sigma, Vt)))
    

    Output for SVD:

    B1 is B after centering:
    [[-2. -2.]
     [ 0.  0.]
     [ 2.  2.]]
    U:
    [[-0.707  0.     0.707]
     [ 0.     1.     0.   ]
     [ 0.707  0.     0.707]]
    S used for building Sigma:
    [4. 0.]
    Sigma:
    [[4. 0.]
     [0. 0.]
     [0. 0.]]
    V already transposed:
    [[ 0.707  0.707]
     [-0.707  0.707]]
    CHECK FOR SVD:
    U * Sigma * Vt (=B1)
    [[-2. -2.]
     [ 0.  0.]
     [ 2.  2.]]
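
    Finally, forming U * Sigma from this SVD of the centered matrix recovers the same scores as pca.fit_transform(B1) above, up to sign:

    print("U * Sigma (PCs from the SVD of the centered matrix):")
    print(np.dot(U, Sigma))

    Output:

    U * Sigma (PCs from the SVD of the centered matrix):
    [[-2.828  0.   ]
     [ 0.     0.   ]
     [ 2.828  0.   ]]

    This suggests the difference in the question comes from taking the SVD of B before centering it.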