PCA Eigenvalues
Principal Component Analysis
• In PCA, a new set of features is extracted from the original features. These new features are quite dissimilar (uncorrelated) in nature, so an n-dimensional feature space is transformed into an m-dimensional feature space (m ≤ n) whose dimensions are orthogonal to each other.
Working of PCA:
• First, calculate the covariance matrix of the data set.
• Then, calculate the eigenvectors and eigenvalues of the covariance matrix.
• The eigenvector with the highest eigenvalue represents the direction in which the data has the highest variance; this is the first principal component.
• The eigenvector with the next highest eigenvalue represents the direction with the highest remaining variance and is also orthogonal to the first direction. This identifies the second principal component.
• In this way, select the top ‘k’ eigenvectors (those with the top ‘k’ eigenvalues) to get the ‘k’ principal components.
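The steps above can be sketched directly with NumPy. This is a minimal illustration on a hypothetical toy dataset (the numbers are made up for demonstration); it centres the data, computes the covariance matrix, eigendecomposes it, and projects onto the top k eigenvectors:

```python
import numpy as np

# Hypothetical toy dataset: 6 samples, 3 features.
X = np.array([
    [2.5, 2.4, 0.5],
    [0.5, 0.7, 1.9],
    [2.2, 2.9, 0.4],
    [1.9, 2.2, 0.8],
    [3.1, 3.0, 0.2],
    [2.3, 2.7, 0.6],
])

# Step 1: centre the data and compute the covariance matrix.
X_centred = X - X.mean(axis=0)
cov = np.cov(X_centred, rowvar=False)

# Step 2: eigenvalues and eigenvectors of the covariance matrix.
# eigh is appropriate because a covariance matrix is symmetric.
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# Step 3: sort eigenvectors by descending eigenvalue, so the first
# column is the direction of highest variance (first principal component).
order = np.argsort(eigenvalues)[::-1]
eigenvalues = eigenvalues[order]
eigenvectors = eigenvectors[:, order]

# Step 4: keep the top k eigenvectors and project the data onto them.
k = 2
components = eigenvectors[:, :k]
X_reduced = X_centred @ components

print(X_reduced.shape)  # (6, 2)
```

Note that the variance of the projected data along each principal component equals the corresponding eigenvalue, which is exactly the "highest variance direction" property described above.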
Applications of PCA
• In machine learning, PCA is used to visualize multidimensional data.
• In healthcare data, PCA is used to explore which factors contribute most to the risk of a chronic disease.
• PCA helps to compress (resize) images.
• PCA is used to analyze stock data and in forecasting.
• More generally, you can use Principal Component Analysis to identify patterns when dealing with high-dimensional data sets.
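As a concrete example of the visualization use case, here is a minimal sketch, assuming scikit-learn is available, that projects the 4-dimensional Iris dataset onto its first two principal components so it can be plotted in 2D:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

# Iris has 150 samples with 4 features each.
X, y = load_iris(return_X_y=True)

# Reduce to 2 dimensions for plotting.
pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)

print(X_2d.shape)                     # (150, 2)
print(pca.explained_variance_ratio_)  # fraction of variance per component
```

The `explained_variance_ratio_` attribute shows how much of the total variance each retained component captures, which is a common way to decide how many components k to keep.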