MA2002 Summary Notes
Week 1
Scope
• 1.1 Linear Systems and their solutions
• 1.2 Elementary Row Operations
• 1.3 Row-Echelon Forms
• 1.4 Gaussian Elimination
Objective
• What is a linear equation and a linear system?
• What is a general solution of a linear equation/system?
• What is the geometrical interpretation of a linear equation/system and its solutions?
• How to find a general solution of a linear equation?
• What are the three elementary row operations (ERO)?
• How to perform ERO?
• What is meant by row equivalence?
• How to identify a row-echelon form (REF) and a reduced row-echelon form (RREF)?
• How to use REF / RREF to get solutions of a linear system?
• What are Gaussian elimination (GE) and Gauss-Jordan elimination (GJE)?
• How to use GE / GJE to reduce an augmented matrix to a REF / RREF?
Summary
• A linear equation with two or more variables has infinitely many solutions.
• A linear system has either no solution, exactly one solution, or infinitely many solutions.
• Elementary row operations do not change the solution set of a linear system.
• Two linear systems have the same solution set if their augmented matrices are row equivalent.
• The solutions of a linear system can be obtained from its REF.
• An augmented matrix has many REFs but only one RREF.
• Given any matrix, we can always apply GE (resp. GJE) to reduce the matrix to a REF (resp. its RREF); see the sketch below.
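A minimal Python sketch of Gauss-Jordan elimination, not part of the notes: it assumes NumPy is available, and the rref helper and the example system are illustrative only.

import numpy as np

def rref(M, tol=1e-12):
    """Reduce a matrix to reduced row-echelon form by Gauss-Jordan elimination."""
    A = M.astype(float).copy()
    rows, cols = A.shape
    r = 0                                         # index of the current pivot row
    for c in range(cols):
        pivot = r + np.argmax(np.abs(A[r:, c]))   # look for a pivot in column c
        if abs(A[pivot, c]) < tol:
            continue                              # no pivot in this column
        A[[r, pivot]] = A[[pivot, r]]             # ERO 1: swap two rows
        A[r] = A[r] / A[r, c]                     # ERO 2: multiply a row by a non-zero constant
        for i in range(rows):
            if i != r:
                A[i] -= A[i, c] * A[r]            # ERO 3: add a multiple of one row to another row
        r += 1
        if r == rows:
            break
    return A

# augmented matrix of  x + 2y + 3z = 6,  2x + 5y + 2z = 4,  6x - 3y + z = 2
aug = np.array([[1, 2, 3, 6],
                [2, 5, 2, 4],
                [6, -3, 1, 2]])
print(rref(aug))    # the last column of the RREF gives the unique solution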
Ex
• Exercise 1: 1 - 21
Week 2
Scope
• 1.4 Gaussian Elimination
• 1.5 Homogeneous Linear Systems
• 2.1 Introduction to Matrices
• 2.2 Matrix Operations
Objective
• How to tell the number of solutions of a linear system from its REF?
• How to use GE / GJE to solve indirect linear system problems?
• What is a homogeneous system?
• What is a trivial / non-trivial solution of a homogeneous system?
• What are the size, entries, and order of a matrix?
• What are diagonal, identity, symmetric, and triangular matrices?
• How to perform matrix addition, matrix multiplication, scalar multiplication and transpose?
• How to express certain matrices and operations using (i, j)-entries?
• What are some properties of matrix operations?
• What are some different ways to express matrix multiplication?
• How to express a linear system in matrix equation form?
Summary
• A linear system has no solution if and only if the last column of the REF of its augmented matrix is a pivot column.
• In a REF, # non-zero rows = # leading entries = # pivot columns.
• In a consistent linear system,
  • if # variables = # non-zero rows, then the system has exactly one solution
  • if # variables > # non-zero rows, then the system has infinitely many solutions
• A homogeneous system is always consistent, as it always has the trivial solution.
• If a homogeneous system has a non-trivial solution, then it has infinitely many solutions.
• A homogeneous system with more variables than equations has infinitely many solutions.
• We do not refer to solutions of a non-homogeneous system as trivial or non-trivial.
• (i, j)-entry of AB = ai1b1j + ai2b2j + ··· + ainbnj
• Matrix multiplication: AB ≠ BA (in general)
• Matrix multiplication: AB = 0 does not imply A = 0 or B = 0
• A linear system can be expressed in matrix equation form and in column form (see the sketch below)
• Matrix transpose: (AB)T = BTAT
• A is a symmetric matrix if and only if A = AT
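A small NumPy sketch, not part of the notes (the matrices are arbitrary illustrations), checking AB ≠ BA, (AB)T = BTAT, and the matrix equation form Ax = b:

import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 1]])

print(np.array_equal(A @ B, B @ A))           # False: AB != BA in general
print(np.array_equal((A @ B).T, B.T @ A.T))   # True:  (AB)^T = B^T A^T

# the system  x + 2y = 5,  3x + 4y = 6  written in matrix equation form Ax = b
b = np.array([5, 6])
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))                  # True: x satisfies Ax = b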
Ex
• Exercise 1: 22 - 30
• Exercise 2: 1 - 24
Week 3
Scope
• 2.3 Inverses of Square Matrices
• 2.4 Elementary Matrices
• 2.5 Determinants
Objective
• What is an invertible matrix?
• What is the inverse of a matrix?
• What are the powers of a matrix?
• What are elementary matrices?
• How are elementary matrices related to elementary row operations?
• How to find the inverse of an elementary matrix?
• What are some different ways to show a matrix is invertible?
• How to find the inverse of an invertible matrix?
• What is the determinant of a matrix?
• What is cofactor expansion of a matrix?
Summary
• If A is invertible, then AB1 = AB2 ⇒ B1 = B2
• If A is invertible, then (AT)–1 = (A–1)T
• If A, B are invertible, then (AB)–1 = B–1A–1
• There are three types of elementary matrices
• All elementary matrices are invertible
• A and B are row equivalent if and only if A = En…E2E1B for some elementary matrices E1, E2, …, En
• A is invertible ↔ the RREF of A is the identity matrix I
• A is invertible ↔ Ax = 0 has only the trivial solution
• Cofactor expansion of a matrix along any row (column) gives the determinant (see the sketch below)
• The determinant of a triangular (diagonal) matrix is the product of its diagonal entries
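A minimal sketch of cofactor expansion along the first row, assuming NumPy; the det_cofactor helper and the example matrix are illustrative only.

import numpy as np

def det_cofactor(A):
    """Determinant via cofactor expansion along the first row."""
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        minor = np.delete(np.delete(A, 0, axis=0), j, axis=1)   # remove row 0 and column j
        total += (-1) ** j * A[0, j] * det_cofactor(minor)       # entry times its cofactor
    return total

A = np.array([[2., 1., 3.],
              [0., -1., 4.],
              [5., 2., 1.]])
print(det_cofactor(A), np.linalg.det(A))   # the two values agree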
Ex
• Exercise 2: 25 - 47
Week 4
Scope
• 2.5 Determinants
• 3.1 Euclidean n-spaces
Objective
• How do matrix operations affect determinants?
• What is the relation between invertibility and determinant?
• What is the adjoint of a matrix?
• What is Cramer’s rule?
• What is an n-vector?
• What are some operations on n-vectors?
• What is the Euclidean n-space Rn?
• How to express subsets of Rn?
Summary
• det(A) = det(AT)
• If A has two identical rows (columns), then det(A) = 0
• Interchanging two rows (columns) changes the determinant by a negative sign
• Adding a multiple of a row (column) to another does not change the determinant
• A is invertible ↔ det(A) ≠ 0
• If A is n×n and c is a scalar, then det(cA) = c^n det(A)
• det(AB) = det(A)det(B)
• det(A–1) = 1/det(A)
• A–1 = [1/det(A)] adj(A)
• Cramer’s rule is a method to solve Ax = b when A is invertible (see the sketch below)
• 2-vectors and 3-vectors can be expressed geometrically and algebraically
• n-vectors (n > 3) can only be expressed algebraically
• Subsets of Rn can be expressed in implicit and explicit forms
• Lines and planes are subsets of R2 and R3
• The solution set of a linear system in n variables is a subset of Rn
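A sketch of Cramer’s rule, assuming NumPy; the cramer helper and the example system are illustrative only. Each xi equals det(Ai)/det(A), where Ai is A with its i-th column replaced by b.

import numpy as np

def cramer(A, b):
    """Solve Ax = b by Cramer's rule (A square and invertible)."""
    d = np.linalg.det(A)
    x = np.empty(len(b))
    for i in range(len(b)):
        Ai = A.copy()
        Ai[:, i] = b                     # replace the i-th column of A by b
        x[i] = np.linalg.det(Ai) / d
    return x

A = np.array([[2., 1.],
              [1., 3.]])
b = np.array([3., 5.])
print(cramer(A, b), np.linalg.solve(A, b))   # both give the same solution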
Ex
• Exercise 2: 48 - 61
• Exercise 3: 1 - 7
Week 5
Scope
• 3.2 Linear Combinations and Linear Spans
• 3.3 Subspaces
Objective
• What is a linear combination?
• How to express a vector as a linear combination?
• What is a linear span?
• What is a subspace?
• What are some examples of subspaces of Rn?
• What is a solution space of a linear system?
• How to show a linear span is contained in another?
Summary
• Linear span (of v1, v2, …, vn) = the set of all linear combinations (of v1, v2, …, vn)
• A subset of Rn is a subspace if it is a linear span of some fixed n-vectors
• A subspace of Rn always contains the zero vector
• Any linear combination of vectors in a subspace V is again a vector in V (the requirement for a subspace: for u, v in the subspace, cu + dv is still in the subspace)
• {0} and Rn are subspaces of Rn
• In R2 and R3, span{u} is a line if u ≠ 0; span{u, v} is a plane if u is not parallel to v
• The solution set of a homogeneous system with n variables is a subspace of Rn
• To show span(S1) ⊆ span(S2), it suffices to show every vector in S1 is a linear combination of vectors in S2; then span(S1) is a subspace of span(S2) (see the sketch below)
• If u ∈ span(S), then span(S) = span(S ∪ {u})
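To check whether a vector w is a linear combination of v1, …, vk, one can test whether the linear system with the vi as columns and w as the constant column is consistent. A minimal NumPy sketch, not part of the notes; the in_span helper and the vectors are illustrative only.

import numpy as np

def in_span(vectors, w, tol=1e-10):
    """Return True if w is a linear combination of the given vectors."""
    V = np.column_stack(vectors)               # vectors as columns of a matrix
    c = np.linalg.lstsq(V, w, rcond=None)[0]   # best-fit coefficients
    return np.linalg.norm(V @ c - w) < tol     # consistent iff the best fit is exact

v1 = np.array([1., 0., 1.])
v2 = np.array([0., 1., 1.])
print(in_span([v1, v2], np.array([2., 3., 5.])))   # True:  2*v1 + 3*v2
print(in_span([v1, v2], np.array([0., 0., 1.])))   # False: not in the plane span{v1, v2}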
Ex
• Exercise 3: 8 - 24
Week 6
Scope
• 3.4 Linear Independence
• 3.5 Bases
Objective
• What is a linearly independent/dependent set?
• How to show that a set is linearly (in)dependent?
• What are some conditions on linearly (in)dependent sets?
• What is a basis for a vector space?
• How to show that a set is a basis?
• How to find a basis for a vector space?
• What are coordinate vectors?
Summary
• u and v are scalar multiples of each other ↔ {u, v} is linearly dependent
• If S contains 0, then S is linearly dependent
• S is linearly dependent ↔ at least one vector in S is a linear combination of the other vectors in S
• {u, v} ⊆ R2 is linearly independent ↔ span{u, v} = R2
• {u, v, w} ⊆ R3 is linearly independent ↔ span{u, v, w} = R3
• If S ⊆ Rn and S has more than n elements, then S is linearly dependent
• Rn has the standard basis (for all n)
• Every non-zero vector space has infinitely many different bases
• All bases for the same vector space V have the same number of vectors (called the dimension of V)
• Every vector in a vector space can be expressed as a linear combination of a basis in a unique way (see the sketch below)
• S is a basis for span(S) ↔ S is linearly independent
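A NumPy sketch, not part of the notes, of two checks from this week: a set of vectors is linearly independent exactly when the matrix having them as columns has rank equal to the number of vectors, and the coordinate vector relative to a basis is found by solving a linear system. The helper name and the vectors are illustrative only.

import numpy as np

def is_linearly_independent(vectors):
    """Vectors are linearly independent iff the matrix they form has rank = number of vectors."""
    V = np.column_stack(vectors)
    return np.linalg.matrix_rank(V) == len(vectors)

u = np.array([1., 2., 0.])
v = np.array([0., 1., 1.])
print(is_linearly_independent([u, v]))            # True
print(is_linearly_independent([u, v, u + 2*v]))   # False: the third vector is a combination

# coordinate vector of w relative to the basis S = {u, v, e3} of R^3
S = np.column_stack([u, v, np.array([0., 0., 1.])])
w = np.array([2., 5., 3.])
print(np.linalg.solve(S, w))                      # the unique coordinates [w]_S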
Ex
• Exercise 3: 25 - 35
Week 7
Scope
• 3.6 Dimensions
• 3.7 Transition Matrices
Objective
• What is the dimension of a vector space?
• How to compute the dimension of a vector space?
• What are some conditions for a set to be a basis for a vector space?
• What is a transition matrix?
• How to compute transition matrices?
• What is the relation between the coordinate vectors of the same vector w.r.t. different bases?
Summary
• If S has more than dim(V) vectors, then S is linearly dependent
• If S has fewer than dim(V) vectors, then S cannot span V
• dim(solution space) = # parameters in the general solution
• W ⊆ V ⇒ dim(W) ≤ dim(V)
• W ⊆ V and dim(W) = dim(V) ⇒ W = V
• S is linearly independent and |S| = dim(V) ⇒ S is a basis for V
• S spans V and |S| = dim(V) ⇒ S is a basis for V
• A is an invertible n×n matrix ↔ the rows (columns) of A form a basis for Rn
• Suppose P is the transition matrix from S to T (see the sketch below). Then
  • [w]T = P[w]S for any vector w in V
  • P is invertible
  • P–1 is the transition matrix from T to S
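A NumPy sketch, not part of the notes, of computing the transition matrix P from a basis S to a basis T (with the basis vectors as columns, P = T–1S) and checking [w]T = P[w]S; the bases are arbitrary illustrations.

import numpy as np

# two bases for R^2, stored as columns
S = np.array([[1., 1.],
              [0., 1.]])
T = np.array([[2., 0.],
              [1., 1.]])

# transition matrix from S to T:  [w]_T = P [w]_S  with  P = T^(-1) S
P = np.linalg.solve(T, S)

w_S = np.array([3., -1.])        # coordinates of some w relative to S
w = S @ w_S                      # the vector w itself
w_T = np.linalg.solve(T, w)      # coordinates of w relative to T

print(np.allclose(P @ w_S, w_T))                 # True: [w]_T = P [w]_S
print(np.allclose(np.linalg.inv(P),
                  np.linalg.solve(S, T)))        # True: P^(-1) is the transition matrix from T to S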
Ex
• Exercise 3: 36 - 49
Week 8
Scope
• 4.1 Row Spaces and Column Spaces
• 4.2 Ranks
• 4.3 Nullspaces and Nullities
Objective
• What are the row space and column space of a matrix?
• How to find bases for row / column spaces?
• How to use row / column spaces to find bases for vector spaces?
• How to extend a basis?
• What is the relation between the column space and consistency of a linear system?
• What is the rank of a matrix?
• What is the relation between rank and invertibility of a matrix?
• What is the relation between rank and consistency of a linear system?
• What are the nullspace and nullity of a matrix?
• What is the Dimension Theorem?
• What is the relation between the nullspace and the solution set of a linear system?
Summary
• The row space (resp. column space) of an m × n matrix is a subspace of Rn (resp. Rm)
• Row operations preserve the row space but do not preserve the column space
• If R is a REF of A, then the non-zero rows of R form a basis for the row space of A
• The columns of A corresponding to the pivot columns of a REF of A form a basis for the column space of A
• A basis for span(S) can be found using the row space or column space method
• We can extend a basis using the row space method
• The row space and column space of a matrix have the same dimension, which is the rank of the matrix
• The largest possible rank of an m×n matrix is min{m, n}
• An n×n matrix A is invertible ↔ rank(A) = n ↔ nullity(A) = 0
• Dimension Theorem: rank(A) + nullity(A) = # of columns of A (see the sketch below)
• For the linear system Ax = b,
  • if b belongs to the column space of A, the system is consistent
  • solution set of the system = (nullspace of A) + (a fixed particular solution of Ax = b)
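A NumPy sketch, not part of the notes: compute the rank, extract a nullspace basis from the singular value decomposition (a convenience not covered in the notes), and check the Dimension Theorem; the matrix is an arbitrary example.

import numpy as np

A = np.array([[1., 2., 3.],
              [2., 4., 6.],
              [1., 0., 1.]])

rank = np.linalg.matrix_rank(A)

# a nullspace basis: right-singular vectors whose singular values are (numerically) zero
_, _, Vt = np.linalg.svd(A)
null_basis = Vt[rank:]                     # rows of Vt spanning the nullspace of A
nullity = null_basis.shape[0]

print(rank, nullity, A.shape[1])           # Dimension Theorem: rank + nullity = # of columns
print(np.allclose(A @ null_basis.T, 0))    # True: every basis vector solves Ax = 0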
Ex
• Exercise 4: 1 - 26
Week 9
Scope
• 5.1 Inner Products in Rn
• 5.2 Orthogonal and Orthonormal Bases
Objective
• What are the algebraic representations of length, distance and angles in Rn?
• What is the dot product of vectors?
• What is an orthogonal/orthonormal set?
• How to normalize a vector?
• What are the properties of orthogonal sets?
• What is the projection of a vector onto a subspace?
• What is the Gram-Schmidt Process?
Summary
• u·v = uvT (RHS is matrix multiplication, u, v as rows)
• u·v = uTv (RHS is matrix multiplication, u, v as columns)
• u·u = 0 ↔ ||u|| = 0 ↔ u = 0
• If u is non-zero, then (1/||u||)u is a unit vector
• If S is an orthogonal set of nonzero vectors, then S is linearly independent
• If S = {u1, u2, …, uk} is an orthogonal basis for V and w ∈ V, then
  w = (w·u1/||u1||²)u1 + (w·u2/||u2||²)u2 + ⋯ + (w·uk/||uk||²)uk
• If p is the projection of w onto a subspace V, then
  • w − p is orthogonal to V
  • d(w, p) ≤ d(w, v) for any vector v in V
• If S = {u1, u2, …, uk} is an orthogonal basis for V, then the projection of w onto V is
  p = (w·u1/||u1||²)u1 + (w·u2/||u2||²)u2 + ⋯ + (w·uk/||uk||²)uk
• The Gram-Schmidt Process converts any basis for a vector space into an orthogonal (orthonormal) basis (see the sketch below)
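A minimal sketch of the Gram-Schmidt Process, assuming NumPy; the gram_schmidt helper and the input basis are illustrative only. Each vector has its projections onto the previously constructed vectors subtracted off, using the projection formula above.

import numpy as np

def gram_schmidt(basis):
    """Convert a basis (a list of vectors) into an orthogonal basis."""
    orthogonal = []
    for v in basis:
        w = v.astype(float).copy()
        for u in orthogonal:
            w -= (w @ u) / (u @ u) * u     # subtract the projection of w onto u
        orthogonal.append(w)
    return orthogonal

basis = [np.array([1., 1., 0.]),
         np.array([1., 0., 1.]),
         np.array([0., 1., 1.])]
U = gram_schmidt(basis)
print([round(u1 @ u2, 10) for u1 in U for u2 in U if u1 is not u2])   # all 0: pairwise orthogonal
print([u / np.linalg.norm(u) for u in U])                             # normalize to get an orthonormal basis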
Ex
• Exercise 5: 1 - 20
Week 10
Scope
• 5.3 Best Approximation
• 5.4 Orthogonal Matrices
• 6.1 Eigenvalues and Eigenvectors
Objective
• What is a least squares solution?
• How to find the best approximate solution to an inconsistent system?
• What is an orthogonal matrix?
• How are orthogonal matrices related to orthonormal bases?
• How are transition matrices related to orthogonal matrices?
• What are eigenvalues, eigenvectors and eigenspaces?
• How to find eigenvalues and eigenvectors of a matrix?
• How to find a basis for an eigenspace of a matrix?
• How are eigenvalues related to invertibility of a matrix?
Summary
• The least squares solutions to Ax = b are given by (see the sketch below)
  • the solutions of ATAx = ATb
  • the solutions of Ax = p, where p is the projection of b onto the column space of A
• The inverse of an orthogonal matrix is its transpose
• The rows (columns) of an n × n orthogonal matrix form an orthonormal basis for Rn
• The transition matrix between two orthonormal bases is an orthogonal matrix
• λ is an eigenvalue of A ⇔ det(λI – A) = 0
• 0 is an eigenvalue of A ⇔ det(A) = 0
• A is invertible ⇔ 0 is not an eigenvalue of A
• The eigenvalues of a triangular matrix are its diagonal entries
• The eigenvectors of A associated with λ are the non-zero solutions of (λI – A)x = 0
• If u and v are eigenvectors of A associated with the same eigenvalue λ and u + v ≠ 0, then u + v is an eigenvector of A
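A NumPy sketch, not part of the notes, of finding a least squares solution from the normal equations ATAx = ATb and of finding eigenvalues and eigenvectors; the systems and matrices are arbitrary examples.

import numpy as np

# an inconsistent system: 3 equations, 2 unknowns
A = np.array([[1., 0.],
              [1., 1.],
              [1., 2.]])
b = np.array([6., 0., 0.])

# least squares solution from the normal equations A^T A x = A^T b
x = np.linalg.solve(A.T @ A, A.T @ b)
print(x, np.linalg.lstsq(A, b, rcond=None)[0])   # matches NumPy's least squares routine

# eigenvalues and eigenvectors:  det(lambda*I - A) = 0,  (lambda*I - A)x = 0
M = np.array([[2., 1.],
              [1., 2.]])
vals, vecs = np.linalg.eig(M)                    # columns of vecs are eigenvectors
print(vals)                                      # eigenvalues 3 and 1
print(np.allclose(M @ vecs, vecs * vals))        # True: M v = lambda v, column by column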
Ex
• Exercise 5: 21 - 34
• Exercise 6: 1 - 8
Week 11
Scope
• 6.2 Diagonalization
• 6.3 Orthogonal Diagonalization
Objective
• What is a diagonalizable matrix?
• How to determine if a matrix is diagonalizable?
• How to diagonalize a matrix?
• How to compute powers of a matrix using diagonalization?
• How to solve a linear recurrence relation using diagonalization?
• What is orthogonal diagonalization?
• What is the characterization of a matrix that is orthogonally diagonalizable?
• How to orthogonally diagonalize a symmetric matrix?
Summary
• An n × n matrix A is diagonalizable ↔ A has n linearly independent eigenvectors
• A set of eigenvectors associated with different eigenvalues is linearly independent
• If geometric multiplicity < algebraic multiplicity for some eigenvalue, the matrix is not diagonalizable
• If an n × n matrix A has n distinct eigenvalues, then A is diagonalizable
• If A is diagonalizable, then A^m = P diag(λ1^m, λ2^m, …, λn^m) P–1, where the columns of P are n linearly independent eigenvectors of A and λ1, λ2, …, λn are the corresponding eigenvalues (see the sketch below)
• A matrix is orthogonally diagonalizable if and only if it is symmetric
• If A is a symmetric matrix, and u, v are two eigenvectors of A associated with distinct eigenvalues, then u and v are orthogonal
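A NumPy sketch, not part of the notes, of computing A^m by diagonalization; the matrix is an arbitrary diagonalizable example.

import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])
vals, P = np.linalg.eig(A)          # A = P D P^(-1), D = diag(eigenvalues), columns of P are eigenvectors

m = 5
Am = P @ np.diag(vals ** m) @ np.linalg.inv(P)          # A^m = P D^m P^(-1)

print(np.allclose(Am, np.linalg.matrix_power(A, m)))    # True: matches direct computation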
Ex
• Exercise 6: 9 - 30
Week 12
Scope
• 7.1 Linear Transformations from Rn to Rm
• 7.2 Ranges and Kernels
Objective
• What is a linear transformation?
• How are linear transformations related to matrices?
• What are the defining conditions of a linear transformation?
• How to use a basis to determine a linear transformation?
• What is the composition of linear transformations?
• What are the range and kernel of a linear transformation?
• What are the rank and nullity of a linear transformation?
• What is the Dimension Theorem for linear transformations?
Summary
• A linear transformation T : Rn → Rm
  • is a mapping between two vector spaces
  • is defined by matrix multiplication
  • maps the zero vector to the zero vector
  • preserves linear combinations
• If A is the standard matrix of T, then T(u) = Au for all u ∈ Rn
• If {u1, u2, …, un} is a basis for Rn, then T is completely determined by the images T(u1), T(u2), …, T(un)
• The standard matrix of the composition T ∘ S is the product of the standard matrices of T and S
• For T : Rn → Rm a linear transformation with standard matrix A,
  • Range of T = column space of A (a subspace of Rm)
  • Kernel of T = nullspace of A (a subspace of Rn)
  • rank(T) = rank(A)
  • nullity(T) = nullity(A)
  • rank(T) + nullity(T) = n (see the sketch below)
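A NumPy sketch, not part of the notes: build the standard matrix of a linear transformation from the images of the standard basis vectors, then check the Dimension Theorem rank(T) + nullity(T) = n; the transformation T is an arbitrary example.

import numpy as np

def T(x):
    """An example linear transformation T : R^3 -> R^2."""
    return np.array([x[0] + x[1], x[1] + x[2]])

n = 3
# standard matrix of T: its columns are the images of the standard basis vectors e1, ..., en
A = np.column_stack([T(e) for e in np.eye(n)])

u = np.array([1., 2., 3.])
print(np.allclose(T(u), A @ u))     # True: T(u) = Au

rank = np.linalg.matrix_rank(A)     # dimension of the range of T (column space of A)
nullity = n - rank                  # dimension of the kernel of T (nullspace of A)
print(rank + nullity == n)          # True: Dimension Theorem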
Ex
• Exercise 7: 1 - 17