NAME: K. ABINASH
CLASS: CSE-D
SUBJECT: LINEAR ALGEBRA
REG NO: AUU23EGCSE118
LITERATURE SURVEY
The project on the application of Eigenvalues and Eigenvectors involves a deep dive into a
fundamental concept of linear algebra, with wide-ranging applications across various fields,
including engineering, physics, computer science, and economics. Eigenvalues and
Eigenvectors are essential tools in analyzing and understanding the behavior of linear systems.
Eigenvectors from eigenvalues: A survey of a basic identity in linear algebra
Peter B. Denton, Stephen J. Parke, Terence Tao, Xining Zhang
If A is an n×n Hermitian matrix with eigenvalues λ_1(A), …, λ_n(A) and i, j = 1, …, n, then the jth component v_{i,j} of a unit eigenvector v_i associated to the eigenvalue λ_i(A) is related to the eigenvalues λ_1(M_j), …, λ_{n−1}(M_j) of the minor M_j of A formed by removing the jth row and column by the formula

$$|v_{i,j}|^2 \prod_{k=1;\, k \neq i}^{n} \bigl(\lambda_i(A) - \lambda_k(A)\bigr) = \prod_{k=1}^{n-1} \bigl(\lambda_i(A) - \lambda_k(M_j)\bigr).$$
We refer to this identity as the "eigenvector-eigenvalue identity" and show how this
identity can also be used to extract the relative phases between the components of any given
eigenvector. Despite the simple nature of this identity and the extremely mature state of
development of linear algebra, this identity was not widely known until very recently. In this
survey we describe the many times that this identity, or variants thereof, have been discovered
and rediscovered in the literature (with the earliest precursor we know of appearing in 1834).
We also provide a number of proofs and generalizations of the identity.
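As a quick sanity check, the identity can be verified numerically. The following sketch (assuming NumPy is available; the matrix size, random seed, and index pair are illustrative choices, not taken from the survey) compares both sides of the identity for a random symmetric matrix:

import numpy as np

n = 4
rng = np.random.default_rng(0)
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                      # real symmetric, hence Hermitian

eigvals, eigvecs = np.linalg.eigh(A)   # eigvecs[:, i] is a unit eigenvector v_i

i, j = 1, 2                            # check the identity for one (i, j) pair
M_j = np.delete(np.delete(A, j, axis=0), j, axis=1)  # remove row j and column j
minor_vals = np.linalg.eigvalsh(M_j)   # eigenvalues of the minor M_j

lhs = abs(eigvecs[j, i])**2 * np.prod([eigvals[i] - eigvals[k]
                                       for k in range(n) if k != i])
rhs = np.prod(eigvals[i] - minor_vals)
print(lhs, rhs)                        # the two sides agree up to rounding error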
What are eigenvectors and eigenvalues?
In linear algebra, it is often important to know which vectors have their directions unchanged
by a given linear transformation. An eigenvector (/ˈaɪɡən-/ EYE-gən-) or characteristic
vector is such a vector. Thus an eigenvector v of a linear transformation T is scaled by
a constant factor λ when the linear transformation is applied to it: T(v) = λv. The
corresponding eigenvalue, characteristic value, or characteristic root is the multiplying
factor λ.
Geometrically, vectors are multi-dimensional quantities with magnitude and direction, often
pictured as arrows. A linear transformation rotates, stretches, or shears the vectors upon
which it acts. Its eigenvectors are those vectors that are only stretched, with no rotation or
shear. The corresponding eigenvalue is the factor by which an eigenvector is stretched or
squished. If the eigenvalue is negative, the eigenvector's direction is reversed.[1]
The eigenvectors and eigenvalues of a transformation serve to characterize it, and so they
play important roles in all the areas where linear algebra is applied, from geology to quantum
mechanics. In particular, it is often the case that a system is represented by a linear
transformation whose outputs are fed back as inputs to the same transformation (feedback). In such an
application, the largest eigenvalue is of particular importance, because it governs the long-
term behavior of the system, after many applications of the linear transformation, and the
associated eigenvector is the steady state of the system.
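This dominance of the largest eigenvalue can be illustrated with power iteration, a standard technique (named here as an illustration, not drawn from the text): repeatedly applying the transformation and renormalizing converges to the dominant eigenvector. A minimal sketch, assuming NumPy and an arbitrary example matrix:

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
v = np.array([1.0, 0.0])             # arbitrary starting vector

for _ in range(50):
    v = A @ v                        # apply the transformation repeatedly
    v = v / np.linalg.norm(v)        # renormalize to avoid overflow

lam = v @ A @ v                      # Rayleigh quotient estimates the eigenvalue
print(lam, v)                        # converges to the largest eigenvalue and
                                     # its eigenvector, the long-term direction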
Definition
Consider an n×n matrix A and a nonzero vector v. If applying A to v (denoted by Av)
simply scales v by a factor of λ, where λ is a scalar, then v is an eigenvector of A,
and λ is the corresponding eigenvalue. This relationship can be expressed as:

Av = λv.
There is a direct correspondence between n-by-n square matrices and linear transformations
from an n-dimensional vector space into itself, given any basis of the vector space. Hence, in
a finite-dimensional vector space, it is equivalent to define eigenvalues and eigenvectors
using either the language of matrices, or the language of linear transformations.
If the vector space V is finite-dimensional, the eigenvalue equation for a linear
transformation T, namely T(v) = λv, is equivalent to the matrix equation

Au = λu,

where A is the matrix representation of T and u is the coordinate vector of v.
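A minimal sketch of this definition in code, assuming NumPy (the matrix is an arbitrary example): np.linalg.eig returns the eigenvalues and eigenvectors, and each pair satisfies Av = λv up to floating-point error.

import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, eigvecs = np.linalg.eig(A)

for i in range(len(eigvals)):
    v = eigvecs[:, i]                          # unit eigenvector for eigvals[i]
    assert np.allclose(A @ v, eigvals[i] * v)  # checks Av = λv numerically
print(eigvals)                                 # 5 and 2 for this matrix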
Overview
Eigenvalues and eigenvectors feature prominently in the analysis of linear transformations.
The prefix eigen- is adopted from the German word eigen (cognate with
the English word own) for 'proper', 'characteristic', 'own'.[6][7] Originally used to
study principal axes of the rotational motion of rigid bodies, eigenvalues and eigenvectors
have a wide range of applications, for example in stability analysis, vibration analysis, atomic
orbitals, facial recognition, and matrix diagonalization.
In essence, an eigenvector v of a linear transformation T is a nonzero vector that, when T is
applied to it, does not change direction. Applying T to the eigenvector only scales the
eigenvector by the scalar value λ, called an eigenvalue. This condition can be written as the equation

T(v) = λv,
referred to as the eigenvalue equation or eigenequation. In general, λ may be any scalar. For
example, λ may be negative, in which case the eigenvector reverses direction as part of the
scaling, or it may be zero or complex.
A classic example, based on the Mona Lisa, provides a simple illustration. Each point on the
painting can be represented as a vector pointing from the center of the painting to that point.
The linear transformation in this example is called a shear mapping. Points in the top half are
moved to the right, and points in the bottom half are moved to the left, proportional to how
far they are from the horizontal axis that goes through the middle of the painting. The vectors
pointing to each point in the original image are therefore tilted right or left, and made longer
or shorter by the transformation. Points along the horizontal axis do not move at all when this
transformation is applied. Therefore, any vector that points directly to the right or left with no
vertical component is an eigenvector of this transformation, because the mapping does not
change its direction. Moreover, these eigenvectors all have an eigenvalue equal to one,
because the mapping does not change their length either.
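This shear example is easy to reproduce numerically. A small sketch, assuming NumPy (the shear factor 0.5 is an arbitrary choice):

import numpy as np

S = np.array([[1.0, 0.5],    # horizontal shear: (x, y) -> (x + 0.5*y, y)
              [0.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(S)
print(eigvals)               # both eigenvalues are 1: lengths are unchanged
print(eigvecs[:, 0])         # [1, 0]: the eigenvector lies along the horizontal axis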
Linear transformations can take many different forms, mapping vectors in a variety of vector
spaces, so the eigenvectors can also take many forms. For example, the linear transformation
could be a differential operator like d/dx, in which case the eigenvectors are functions
called eigenfunctions that are scaled by that differential operator, such as

d/dx e^{λx} = λ e^{λx}.
Alternatively, the linear transformation could take the form of an n by n matrix, in which case
the eigenvectors are n by 1 matrices. If the linear transformation is expressed in the form of
an n by n matrix A, then the eigenvalue equation for a linear transformation above can be
rewritten as the matrix multiplication

Av = λv,
where the eigenvector v is an n by 1 matrix. For a matrix, eigenvalues and eigenvectors can
be used to decompose the matrix—for example by diagonalizing it.
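A minimal sketch of such a decomposition, assuming NumPy and a diagonalizable example matrix: the eigendecomposition A = V D V⁻¹ reconstructs A and makes matrix powers cheap to compute.

import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
eigvals, V = np.linalg.eig(A)   # columns of V are the eigenvectors
D = np.diag(eigvals)            # diagonal matrix of eigenvalues

A_rebuilt = V @ D @ np.linalg.inv(V)        # reconstruct A = V D V^{-1}
assert np.allclose(A, A_rebuilt)

# Diagonalization makes powers cheap: A^10 = V D^10 V^{-1}
A_pow10 = V @ np.diag(eigvals**10) @ np.linalg.inv(V)
assert np.allclose(A_pow10, np.linalg.matrix_power(A, 10))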
Eigenvalues and eigenvectors give rise to many closely related mathematical concepts, and
the prefix eigen- is applied liberally when naming them:
• The set of all eigenvectors of a linear transformation, each paired with its
corresponding eigenvalue, is called the eigensystem of that transformation.
• The set of all eigenvectors of T corresponding to the same eigenvalue, together
with the zero vector, is called an eigenspace, or the characteristic
space of T associated with that eigenvalue.
• If a set of eigenvectors of T forms a basis of the domain of T, then this basis is
called an eigenbasis.
Eigenvalues represent scalar values that quantify how a linear transformation (represented by
a matrix) affects the corresponding Eigenvectors. These concepts find extensive use in diverse
areas, such as structural engineering for analyzing vibrations, quantum mechanics for
describing the behavior of quantum systems, and image processing for compression and
filtering.
One of the key applications of Eigenvalues and Eigenvectors is in solving systems of linear
differential equations. By converting a system of differential equations into a matrix form, one
can use Eigenvalues and Eigenvectors to find solutions, which is particularly useful in
modeling physical systems governed by differential equations, such as mechanical vibrations
or electrical circuits.
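As an illustration of this approach, the sketch below (assuming NumPy; the coefficient matrix and initial condition are illustrative, e.g. a damped mechanical system) solves x′(t) = A x(t) by expanding the initial state in the eigenbasis, so that each mode evolves independently as e^{λt}:

import numpy as np

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])    # eigenvalues -1 and -2: a decaying system
x0 = np.array([1.0, 0.0])       # initial condition

eigvals, V = np.linalg.eig(A)
c = np.linalg.solve(V, x0)      # expand x0 in the eigenbasis

def x(t):
    # superpose the modes; each component grows or decays as exp(λ_i t)
    return (V * np.exp(eigvals * t)) @ c

print(x(0.0))                   # recovers x0
print(x(1.0))                   # state after one time unit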
Another significant application is in principal component analysis (PCA) in statistics and data
analysis. PCA uses Eigenvectors to reduce the dimensionality of data while preserving its
variance, enabling more efficient analysis and visualization of complex datasets.
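A minimal PCA sketch on synthetic data, assuming NumPy (the sample count and the choice of two components are illustrative): the principal directions are the top eigenvectors of the data's covariance matrix.

import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))       # 200 samples, 5 features
Xc = X - X.mean(axis=0)                 # center the data

cov = np.cov(Xc, rowvar=False)          # 5x5 covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order

order = np.argsort(eigvals)[::-1]       # sort by explained variance
components = eigvecs[:, order[:2]]      # top-2 principal directions

X_reduced = Xc @ components             # 200x2 low-dimensional projection
print(X_reduced.shape)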
Eigenvalues and Eigenvectors also play a crucial role in the field of quantum mechanics, where
they are used to represent the states of quantum systems and to calculate observable quantities
such as energy levels and probabilities of measurement outcomes.
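As a toy illustration (the two-level Hamiltonian below is an assumption chosen for demonstration, not a physical model from the text), the energy levels of a quantum system appear as the eigenvalues of its Hermitian Hamiltonian matrix:

import numpy as np

H = np.array([[1.0, 0.2],       # toy two-level Hamiltonian with a small coupling
              [0.2, 2.0]])

energies, states = np.linalg.eigh(H)   # Hermitian, so eigh is the right solver
print(energies)                  # the observable energy levels
print(np.abs(states[:, 0])**2)   # probabilities of measuring the ground state
                                 # in each basis component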
In conclusion, the study and application of Eigenvalues and Eigenvectors are essential in
various fields, offering powerful tools for analyzing and solving complex problems.
Understanding these concepts not only provides insights into the behavior of linear systems but
also opens up avenues for innovation and advancement in diverse areas of science and
engineering.