PCA Topic

Principal Component Analysis (PCA) transforms an n-dimensional feature space into an m-dimensional space by extracting new features that are orthogonal. It involves calculating the covariance matrix and identifying eigenvectors corresponding to the highest eigenvalues to determine principal components. PCA has various applications, including data visualization, healthcare analysis, image resizing, and stock data forecasting.


Principal Component Analysis
• In PCA, a new set of features is extracted from the original features; the new features are quite dissimilar in nature from the originals. An n-dimensional feature space is thus transformed into an m-dimensional feature space (m ≤ n), where the new dimensions are orthogonal to each other.
Working of PCA:
• First, mean-center the data set and calculate its covariance matrix.
• Then, calculate the eigenvectors and eigenvalues of the covariance matrix.
• The eigenvector with the highest eigenvalue represents the direction of highest variance: this is the first principal component.
• The eigenvector with the next highest eigenvalue represents the direction of highest remaining variance; it is orthogonal to the first direction, which is how the second principal component is identified.
• Continuing in this way, the top ‘k’ eigenvectors (those with the top ‘k’ eigenvalues) give the ‘k’ principal components.
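The steps above can be sketched in NumPy. This is a minimal illustration of the covariance/eigendecomposition route, not a production implementation; the function name `pca` and the example data are my own for illustration.

```python
import numpy as np

def pca(X, k):
    # Mean-center the data so the covariance matrix is meaningful
    Xc = X - X.mean(axis=0)
    # Covariance matrix of the features (n_features x n_features)
    cov = np.cov(Xc, rowvar=False)
    # eigh handles symmetric matrices; eigenvalues come back in ascending order
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Pick the k eigenvectors (columns) with the largest eigenvalues
    order = np.argsort(eigvals)[::-1][:k]
    components = eigvecs[:, order]
    # Project the centered data onto the k principal components
    return Xc @ components

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))   # 100 samples, 5 features
Z = pca(X, 2)
print(Z.shape)                  # (100, 2)
```

Note that `eigh` is preferred over `eig` here because the covariance matrix is symmetric, which guarantees real eigenvalues and orthogonal eigenvectors.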
Applications of PCA

• PCA in machine learning is used to visualize multidimensional data.
• In healthcare data, PCA is used to explore the factors assumed to be most important in increasing the risk of a chronic disease.
• PCA helps to compress and resize images.
• PCA is used to analyze and forecast stock data.
• You can also use Principal Component Analysis to find patterns when dealing with high-dimensional data sets.
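As an example of the visualization use case, a high-dimensional data set can be projected down to two components with scikit-learn's `PCA` (assuming scikit-learn is available; the Iris data set is used here purely as a stand-in for any multidimensional data):

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, y = load_iris(return_X_y=True)    # 150 samples, 4 features
pca = PCA(n_components=2)
X2 = pca.fit_transform(X)            # project onto the top 2 components
print(X2.shape)                      # (150, 2)
# Fraction of total variance captured by each component
print(pca.explained_variance_ratio_)
```

The two resulting columns can be fed directly to a scatter plot; for Iris, the first two components retain most of the variance, which is why such 2-D views are often informative.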
