Maths for Signals and Systems
Problem Sheet 6
Problems
1. Consider a matrix $A$ with eigenvalues $\lambda_1 = 1$ and $\lambda_2 = -1$, and eigenvectors $x_1 = [\cos\theta \;\; \sin\theta]^T$ and $x_2 = [-\sin\theta \;\; \cos\theta]^T$. Show that $A = A^T$, $A^2 = I$, $\det(A) = -1$, $A^{-1} = A$.
Solution
The eigenvectors of this matrix are perpendicular to each other, since:
$x_1^T x_2 = [\cos\theta \;\; \sin\theta] \begin{bmatrix} -\sin\theta \\ \cos\theta \end{bmatrix} = -\cos\theta\sin\theta + \sin\theta\cos\theta = 0.$
Furthermore, their magnitude is 1, since:
$\|x_i\|^2 = \cos^2\theta + \sin^2\theta = 1, \quad i = 1, 2.$
For the above reasons, we conclude that the eigenvectors of matrix $A$ are orthonormal and therefore $A$ is a symmetric matrix with $A = A^T$.
Moreover, $A$ can be diagonalised as $A = Q\Lambda Q^T$, with $Q = [x_1 \;\; x_2]$ the orthogonal matrix which contains the eigenvectors of $A$ in its columns and $\Lambda = \begin{bmatrix} 1 & 0 \\ 0 & -1 \end{bmatrix}$.
$A^2 = Q\Lambda Q^T Q\Lambda Q^T = Q\Lambda^2 Q^T$ (note that $Q^T Q = I$), and $\Lambda^2 = \begin{bmatrix} 1^2 & 0 \\ 0 & (-1)^2 \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = I$, so that
$A^2 = Q\Lambda^2 Q^T = QIQ^T = QQ^T = I.$
$\det(A) = \lambda_1 \lambda_2 = 1 \cdot (-1) = -1.$
Finally, from $A^2 = I$ we see that $A^{-1} = A$.
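The four identities can also be checked numerically. A minimal sketch in plain Python (the angle $\theta = 0.3$ is an arbitrary sample value, not part of the problem):

```python
import math

# Build A = Q Λ Q^T from the given eigenvectors and eigenvalues.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

theta = 0.3                      # arbitrary sample angle
c, s = math.cos(theta), math.sin(theta)
Q = [[c, -s], [s, c]]            # columns are x1 and x2
Lam = [[1, 0], [0, -1]]          # Λ = diag(1, -1)
A = matmul(matmul(Q, Lam), [[c, s], [-s, c]])   # A = Q Λ Q^T

assert math.isclose(A[0][1], A[1][0])           # A = A^T
A2 = matmul(A, A)
assert math.isclose(A2[0][0], 1) and math.isclose(A2[1][1], 1)
assert abs(A2[0][1]) < 1e-12 and abs(A2[1][0]) < 1e-12   # A^2 = I
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
assert math.isclose(det, -1)                    # det(A) = -1
```

Since $A^2 = I$, the inverse check $A^{-1} = A$ follows from the second assertion.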
2. Consider a matrix $A$ with $A^3 = 0$. Find the eigenvalues of $A$. Give an example of a matrix of any size that satisfies $A^3 = 0$ with $A \neq 0$. In the case where a matrix $A$ satisfies $A^3 = 0$ and is also symmetric, prove that $A = 0$.
Solution
The eigenvalues of $A^3$ are $\lambda_i^3$, where $\lambda_i$ are the eigenvalues of $A$. This can be seen from the fact that if $\lambda_i$ is an eigenvalue of $A$ with corresponding eigenvector $x_i$, then
$A^3 x_i = A^2 (A x_i) = A^2 \lambda_i x_i = \lambda_i A (A x_i) = \lambda_i^2 A x_i = \lambda_i^3 x_i.$
But $A^3 x_i = \lambda_i^3 x_i = 0$, and since $x_i \neq 0$ we get $\lambda_i^3 = 0 \Rightarrow \lambda_i = 0$. An example of a matrix which is non-zero but has zero eigenvalues is $A = \begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}$, with $A^n = 0$ for $n \geq 2$. In the case where $A$ is symmetric, the diagonalisation of $A$ is $A = Q\Lambda Q^T$, with $Q$ an orthogonal matrix that contains the eigenvectors of $A$. Since all the eigenvalues are zero, $\Lambda = 0$ and
$A = Q\Lambda Q^T = Q\,0\,Q^T = 0.$
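The nilpotent example above is easy to verify directly; a short Python sketch:

```python
# The non-zero matrix A = [[0, 1], [0, 0]] from the solution: A^2 = 0,
# hence A^3 = 0, yet A itself is not the zero matrix.
def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, 1], [0, 0]]
A2 = matmul(A, A)
A3 = matmul(A2, A)
assert A2 == [[0, 0], [0, 0]] and A3 == [[0, 0], [0, 0]]

# Characteristic polynomial λ^2 - tr(A)·λ + det(A): here tr = det = 0,
# so λ^2 = 0 and both eigenvalues are 0 even though A ≠ 0.
tr = A[0][0] + A[1][1]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
assert tr == 0 and det == 0
```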
3. Show that the eigenvalues of a symmetric matrix A with real entries are real.
Solution
Suppose that $\lambda$ is an eigenvalue of $A$ that corresponds to the eigenvector $x$. In that case we have the following:
$Ax = \lambda x \Rightarrow \overline{(Ax)} = \overline{(\lambda x)} \Rightarrow \bar{A}\bar{x} = \bar{\lambda}\bar{x} \Rightarrow A\bar{x} = \bar{\lambda}\bar{x}$, since $A$ is real.
In the above we transpose both sides and we get:
$(A\bar{x})^T = (\bar{\lambda}\bar{x})^T \Rightarrow \bar{x}^T A^T = \bar{\lambda}\bar{x}^T.$
Now we multiply both sides from the right with $x$:
$\bar{x}^T A^T x = \bar{\lambda}\bar{x}^T x \Rightarrow \bar{x}^T A x = \bar{\lambda}\bar{x}^T x$, since $A$ is symmetric.
Using $Ax = \lambda x$ on the left-hand side gives $\lambda \bar{x}^T x = \bar{\lambda}\bar{x}^T x$, and since $\bar{x}^T x = \|x\|^2 > 0$, we conclude $\lambda = \bar{\lambda}$. Therefore, $\lambda$ is real.
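For $2\times 2$ symmetric matrices the same conclusion can be seen from the characteristic polynomial, whose discriminant $(a-d)^2 + 4b^2$ is never negative. A quick randomised check (the ranges and seed are arbitrary choices):

```python
import math, random

# For A = [[a, b], [b, d]] the eigenvalues solve λ^2 - (a+d)λ + (ad - b^2) = 0.
# The discriminant is (a+d)^2 - 4(ad - b^2) = (a-d)^2 + 4b^2 >= 0,
# so both roots are always real.
random.seed(0)
for _ in range(1000):
    a, b, d = (random.uniform(-10, 10) for _ in range(3))
    disc = (a + d) ** 2 - 4 * (a * d - b * b)
    assert disc >= -1e-12                      # never (meaningfully) negative
    lam1 = ((a + d) + math.sqrt(max(disc, 0))) / 2
    lam2 = ((a + d) - math.sqrt(max(disc, 0))) / 2
    assert math.isclose(lam1 + lam2, a + d, abs_tol=1e-9)          # trace
    assert math.isclose(lam1 * lam2, a * d - b * b, abs_tol=1e-9)  # determinant
```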
4. (i) A skew-symmetric (or antisymmetric) matrix $B$ has the property $B^T = -B$. Show that the eigenvalues of a skew-symmetric matrix $B$ with real entries are purely imaginary.
(ii) Show that the diagonal elements of a skew-symmetric matrix are 0.
Solution
(i) $Bx = \lambda x \Rightarrow \overline{(Bx)} = \overline{(\lambda x)} \Rightarrow \bar{B}\bar{x} = \bar{\lambda}\bar{x} \Rightarrow B\bar{x} = \bar{\lambda}\bar{x}$, since $B$ is real.
In the above we transpose both sides and we get:
$(B\bar{x})^T = (\bar{\lambda}\bar{x})^T \Rightarrow \bar{x}^T B^T = \bar{\lambda}\bar{x}^T.$
Now we multiply both sides from the right with $x$:
$\bar{x}^T B^T x = \bar{\lambda}\bar{x}^T x \Rightarrow -\bar{x}^T B x = \bar{\lambda}\bar{x}^T x$, since $B$ is skew-symmetric.
Using $Bx = \lambda x$ on the left-hand side gives $-\lambda \bar{x}^T x = \bar{\lambda}\bar{x}^T x$, and since $\bar{x}^T x > 0$ we conclude $\bar{\lambda} = -\lambda$. Therefore, $\lambda$ is purely imaginary. The only real eigenvalue that a skew-symmetric matrix might have is the zero eigenvalue.
(ii) We know that $B^T = -B$, and therefore if $b_{ij}$ is any element of $B$ then $b_{ij} = -b_{ji}$. Therefore, for the diagonal elements we have $b_{ii} = -b_{ii}$, and this gives $b_{ii} = 0$.
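A $2\times 2$ illustration of both parts (the value $b = 2.5$ is an arbitrary choice):

```python
import math

# B = [[0, b], [-b, 0]] is skew-symmetric: zero diagonal and b_ij = -b_ji.
b = 2.5
B = [[0, b], [-b, 0]]
assert B[0][0] == 0 and B[1][1] == 0          # part (ii): zero diagonal
assert B[0][1] == -B[1][0]                    # B^T = -B

# Characteristic polynomial: λ^2 - tr(B)λ + det(B) = λ^2 + b^2 = 0,
# so λ = ±ib, purely imaginary, as proved in part (i).
tr = B[0][0] + B[1][1]
det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
assert tr == 0 and det == b * b
lam = complex(0, b)                           # λ = ib
assert abs(lam * lam + det) < 1e-12           # λ satisfies λ^2 + b^2 = 0
```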
5. Consider the skew-symmetric matrix
$M = \frac{1}{\sqrt{3}}\begin{bmatrix} 0 & 1 & 1 & 1 \\ -1 & 0 & -1 & 1 \\ -1 & 1 & 0 & -1 \\ -1 & -1 & 1 & 0 \end{bmatrix}.$
(i) Show that $\|Mv\| = \|v\|$, with $v$ any 4-dimensional column vector. What observation can you make out of this result?
(ii) Using the trace of $M$, the result of 5(i) and, furthermore, the result of Problem 4 above, find all four eigenvalues of $M$.
Solution
(i) With $v = [x \;\; y \;\; z \;\; w]^T$ we have
$Mv = \frac{1}{\sqrt{3}}\begin{bmatrix} 0 & 1 & 1 & 1 \\ -1 & 0 & -1 & 1 \\ -1 & 1 & 0 & -1 \\ -1 & -1 & 1 & 0 \end{bmatrix}\begin{bmatrix} x \\ y \\ z \\ w \end{bmatrix} = \frac{1}{\sqrt{3}}\begin{bmatrix} y + z + w \\ -x - z + w \\ -x + y - w \\ -x - y + z \end{bmatrix},$
and therefore
$\|Mv\|^2 = \frac{1}{3}\left[(y+z+w)^2 + (-x-z+w)^2 + (-x+y-w)^2 + (-x-y+z)^2\right]$
$= \frac{1}{3}(y^2+z^2+w^2+2yz+2yw+2zw) + \frac{1}{3}(x^2+z^2+w^2+2xz-2xw-2zw)$
$+ \frac{1}{3}(x^2+y^2+w^2-2xy+2xw-2yw) + \frac{1}{3}(x^2+y^2+z^2+2xy-2xz-2yz)$
$= \frac{1}{3}\cdot 3\,(x^2+y^2+z^2+w^2) = x^2+y^2+z^2+w^2 = \|v\|^2,$
since every cross term cancels. Hence $\|Mv\| = \|v\|$; in other words, $M$ is an orthogonal matrix ($M^T M = I$).
If we take an eigenvalue $\lambda$ of matrix $M$ which corresponds to the eigenvector $v$, we have that $Mv = \lambda v \Rightarrow \|Mv\| = |\lambda|\,\|v\|$. Based on the result $\|Mv\| = \|v\|$ we have $|\lambda|\,\|v\| = \|v\| \Rightarrow |\lambda| = 1$. Therefore, the eigenvalues of the given matrix have magnitude 1.
(ii) In Problem 4 we proved that the eigenvalues of a skew-symmetric matrix are purely imaginary. For this particular case we also see that they have magnitude 1. Therefore, each eigenvalue of $M$ is $i$, $-i$ or $0$. The matrix has four eigenvalues and, since the trace is zero, they sum up to zero. The determinant of $M$ is not zero (it is actually $\left(\frac{1}{\sqrt{3}}\right)^4 \cdot 9 = 1$), which means that the matrix is full rank and therefore it doesn't have any zero eigenvalues. Therefore, two of the eigenvalues are equal to $i$ and the other two are equal to $-i$.
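These properties can be confirmed numerically. The sketch below uses the sign pattern that makes $M$ skew-symmetric and matches the expansion of $\|Mv\|^2$ in the solution:

```python
import math, random

# The matrix M from Problem 5 (off-diagonal entries ±1/sqrt(3)).
s = 1 / math.sqrt(3)
M = [[ 0,  s,  s,  s],
     [-s,  0, -s,  s],
     [-s,  s,  0, -s],
     [-s, -s,  s,  0]]

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(4)) for i in range(4)]

# ||Mv|| = ||v|| for random vectors v, i.e. M is orthogonal.
random.seed(1)
for _ in range(100):
    v = [random.uniform(-5, 5) for _ in range(4)]
    Mv = matvec(M, v)
    assert math.isclose(sum(t * t for t in Mv), sum(t * t for t in v))

# Skew-symmetry and zero trace, consistent with eigenvalues i, i, -i, -i.
assert all(M[i][j] == -M[j][i] for i in range(4) for j in range(4))
assert sum(M[i][i] for i in range(4)) == 0
```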
6. Consider matrices $A$ and $B$ shown below:
$A = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix}$ and $B = \frac{1}{3}\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix}$
(i) To which of these classes do they belong? Invertible, orthogonal, projection, permutation, diagonalisable, Markov.
(ii) Which of the factorisations $LU$, $QR$, $S\Lambda S^{-1}$, $Q\Lambda Q^{-1}$ are possible for $A$ and $B$?
Solution
(i) For $A$ we have:
The eigenvalues are $-1, 1, 1$ and corresponding eigenvectors are $[1 \;\; 0 \;\; -1]^T$, $[1 \;\; 0 \;\; 1]^T$ and $[0 \;\; 1 \;\; 0]^T$. The eigenvectors are independent and also orthogonal. The determinant is $-1$.
$A$ is invertible (its determinant is non-zero), orthogonal (its rows are orthonormal), permutation (obvious), diagonalisable (it has a full set of independent eigenvectors; note that invertibility alone would not guarantee this), Markov (it satisfies the Markov properties: its rows/columns have non-negative elements which sum up to 1). It is not a projection matrix because it doesn't satisfy the property $A^2 = A$.
For $B$ we have:
The eigenvalues are $1, 0, 0$ and corresponding eigenvectors are $[1 \;\; 1 \;\; 1]^T$, $[-1 \;\; 0 \;\; 1]^T$ and $[-1 \;\; 1 \;\; 0]^T$. The eigenvectors are independent and, since $B$ is symmetric, they can be chosen to be mutually orthogonal. The determinant is 0.
$B$ is a projection ($B^2 = B$ and $B$ is symmetric), diagonalisable (it has a full set of independent eigenvectors), Markov (it satisfies the Markov properties: its rows/columns have non-negative elements which sum up to 1). It is not invertible, orthogonal or permutation.
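The class membership claims for $A$ and $B$ can be checked directly in plain Python:

```python
# Verify the Problem 6(i) claims for A (orthogonal, permutation, Markov,
# not a projection) and B (symmetric projection).
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(X):
    return [list(row) for row in zip(*X)]

A = [[0, 0, 1], [0, 1, 0], [1, 0, 0]]
B = [[1/3] * 3 for _ in range(3)]
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

assert matmul(A, transpose(A)) == I          # A is orthogonal
assert all(sum(row) == 1 for row in A)       # rows sum to 1 (Markov/permutation)
assert matmul(A, A) != A                     # A is not a projection (A^2 = I here)

B2 = matmul(B, B)
assert all(abs(B2[i][j] - B[i][j]) < 1e-15
           for i in range(3) for j in range(3))   # B^2 = B: projection
assert B == transpose(B)                     # and symmetric
```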
(ii) For a $3 \times 3$ matrix the $LU$ decomposition looks like:
$A = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix} = \begin{bmatrix} l_{11} & 0 & 0 \\ l_{21} & l_{22} & 0 \\ l_{31} & l_{32} & l_{33} \end{bmatrix}\begin{bmatrix} u_{11} & u_{12} & u_{13} \\ 0 & u_{22} & u_{23} \\ 0 & 0 & u_{33} \end{bmatrix} = LU$
For the given matrix $A$ we have $a_{11} = 0$ and therefore at least one of $l_{11}$ and $u_{11}$ has to be zero. In that case either $L$ or $U$ is singular, which is not possible since $A$ is not singular. Therefore, $A$ doesn't have an $LU$ decomposition. In order for $A$ to have an $LU$ decomposition, we must reorder the rows of $A$, i.e., $A$ must be multiplied from the left with a permutation matrix $P$, and in that case we have $PA = LU$.
$A$ has a $QR$ decomposition: since $A$ is itself orthogonal, we can take $Q = A$ and $R = I$:
$A = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix} = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 1 & 0 \\ 1 & 0 & 0 \end{bmatrix}\begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix}$
For the given matrix $B$ we find the $LU$ decomposition using elimination, as follows:
$B = \frac{1}{3}\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ 1 & 1 & 0 \\ 1 & 0 & 1 \end{bmatrix} \cdot \frac{1}{3}\begin{bmatrix} 1 & 1 & 1 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix},$ and therefore $B$ does have an $LU$ decomposition.
$B = \frac{1}{3}\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 1 \\ 1 & 1 & 1 \end{bmatrix} = \begin{bmatrix} 0.577 & 0.816 & 0 \\ 0.577 & -0.408 & 0.707 \\ 0.577 & -0.408 & -0.707 \end{bmatrix}\begin{bmatrix} 0.577 & 0.577 & 0.577 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix},$ where the matrix
$\begin{bmatrix} 0.577 & 0.816 & 0 \\ 0.577 & -0.408 & 0.707 \\ 0.577 & -0.408 & -0.707 \end{bmatrix}$ is orthogonal; therefore $B$ does have a $QR$ decomposition.
Both matrices have an $S\Lambda S^{-1}$ decomposition since they have a full set of independent eigenvectors.
Both matrices have a $Q\Lambda Q^{-1}$ ($= Q\Lambda Q^T$) decomposition since, due to their symmetry, we can choose a set of independent and also orthogonal eigenvectors.
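Finally, the factorisations of 6(ii) can be verified numerically. In the sketch below, the exact values $1/\sqrt{3}$, $2/\sqrt{6}$ and $1/\sqrt{2}$ stand in for the rounded entries 0.577, 0.816 and 0.707 shown above:

```python
import math

def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[0, 0, 1], [0, 1, 0], [1, 0, 0]]
# Reordering the rows of A with the permutation P = A itself gives PA = I,
# which trivially factors as LU with L = U = I.
PA = matmul(A, A)
assert PA == [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

B = [[1/3] * 3 for _ in range(3)]
# The LU factorisation of B found by elimination in the solution.
L = [[1, 0, 0], [1, 1, 0], [1, 0, 1]]
U = [[1/3, 1/3, 1/3], [0, 0, 0], [0, 0, 0]]
LU = matmul(L, U)
assert all(abs(LU[i][j] - B[i][j]) < 1e-15 for i in range(3) for j in range(3))

# The QR factorisation of B, with exact values for the rounded entries.
s3, s6, s2 = math.sqrt(3), math.sqrt(6), math.sqrt(2)
Q = [[1/s3,  2/s6,     0],
     [1/s3, -1/s6,  1/s2],
     [1/s3, -1/s6, -1/s2]]
R = [[1/s3, 1/s3, 1/s3], [0, 0, 0], [0, 0, 0]]
QR = matmul(Q, R)
assert all(abs(QR[i][j] - B[i][j]) < 1e-12 for i in range(3) for j in range(3))
```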