
SVD (Singular Value Decomposition): a linear-algebra method that decomposes a matrix into three matrices, two of which are orthogonal.

Orthogonal unit vector: a vector that is orthogonal to another vector (or set of vectors) and has a magnitude of 1.

Other ways to find the principal components in PCA:
1. Eigenvalue Decomposition (EVD) of the covariance matrix
2. Gradient descent (optimization-based methods)
3. Power iteration for the leading principal component
4. Matrix factorization (NMF, factor analysis, etc.)
5. Kernel PCA for nonlinear data

Tools and libraries for PCA in Python:
1. numpy.linalg.eig (Eigenvalue Decomposition)
2. scipy.linalg.svd (SVD)
3. sklearn.decomposition.PCA
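As a minimal sketch of how two of these routes relate, the snippet below (on synthetic data, using `numpy.linalg.eigh`, the symmetric-matrix variant of `numpy.linalg.eig`) checks that EVD of the covariance matrix and SVD of the centered data recover the same leading principal component:

```python
import numpy as np
from scipy.linalg import svd

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))   # synthetic data: 100 samples, 3 features
Xc = X - X.mean(axis=0)         # PCA operates on centered data

# Route 1: eigenvalue decomposition of the covariance matrix
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
leading_evd = eigvecs[:, np.argmax(eigvals)]

# Route 2: SVD of the centered data matrix itself
U, s, Vt = svd(Xc, full_matrices=False)
leading_svd = Vt[0]

# Both routes give the same direction (up to an arbitrary sign flip)
print(abs(leading_evd @ leading_svd))  # ~1.0
```

The singular values and covariance eigenvalues are also linked: squaring the singular values and dividing by (n − 1) reproduces the eigenvalues of the covariance matrix.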

The purpose of using PCA or SVD is to decompose the original large matrix/dataset into smaller matrices; this process is called "dimensionality reduction". We do this because when a dataset has many dimensions (features/columns/variables), we can visualize at most 3 of them at a time, so we project the data down to a few components that preserve most of the variance.
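A minimal sketch of this, using the `sklearn.decomposition.PCA` interface mentioned above on a hypothetical random dataset (the sample and feature counts here are arbitrary):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(42)
X = rng.normal(size=(150, 6))   # hypothetical dataset: 150 samples, 6 features

# Reduce 6 features to 2 principal components so the data fits a 2-D scatter plot
pca = PCA(n_components=2)
X2 = pca.fit_transform(X)

print(X2.shape)                       # (150, 2)
print(pca.explained_variance_ratio_)  # fraction of variance kept by each PC
```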

Here are the key matrices and steps used in PCA:
1. Data Matrix
2. Covariance Matrix
3. Principal Components Matrix
4. Transformed Data Matrix
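The four steps above can be sketched directly in NumPy (with synthetic data; `eigh` is used here because the covariance matrix is symmetric):

```python
import numpy as np

rng = np.random.default_rng(1)

# 1. Data matrix (rows = samples, columns = features), centered for PCA
X = rng.normal(size=(50, 3))
Xc = X - X.mean(axis=0)

# 2. Covariance matrix (features x features)
C = np.cov(Xc, rowvar=False)

# 3. Principal-components matrix: eigenvectors of C, sorted by eigenvalue
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
W = eigvecs[:, order]

# 4. Transformed data matrix: project the centered data onto the components
T = Xc @ W

# The projected features are uncorrelated: covariance of T is diagonal
print(np.round(np.cov(T, rowvar=False), 6))
```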
