LA 02 Solving Linear Systems 2025
Preliminaries
Echelon Form of a Matrix
Elementary Matrices
Finding A-1
Equivalent Matrices
LU-Factorization
Applications: Polynomial Curve Fitting
Applications: The Global Positioning System (GPS)
Applications: Linear Regression by Ordinary Least Squares (OLS)
Productivity Optimization Lab@NTU LA02_ Solving Linear Systems Chia-Yen Lee, Ph.D. 2
Preliminaries
Key Terms:
Row echelon form
Reduced row echelon form
Echelon Form of a Matrix
Solve a system using elimination of variables:

x1 +  x2 + 2x3 = −1        x1 + x2 + 2x3 = −1        x1 + x2 + 2x3 = −1
x1 − 2x2 +  x3 = −5   →       −3x2 −  x3 = −4   →        3x2 +  x3 =  4
3x1 + x2 +  x3 =  3           −2x2 − 5x3 =  6            2x2 + 5x3 = −6

x1 + x2 + 2x3 = −1         x1 + x2 + 2x3 = −1        x1 + x2 + 2x3 = −1
    6x2 +  2x3 =  8   →        6x2 + 2x3 =  8   →        6x2 + 2x3 =  8
    6x2 + 15x3 = −18               13x3 = −26                  x3 = −2

Back substitution then gives 6x2 + 2(−2) = 8, so x2 = 2, and x1 + 2 + 2(−2) = −1, so x1 = 1.

The same steps on the augmented matrix (pivot columns: columns 1, 2, 3):

[1  1  2 | −1]     [1 −3 ... ] wait
[1  1  2 | −1]     [1  1   2  | −1 ]     [1  1    2   | −1  ]
[1 −2  1 | −5]  →  [0 −3  −1  | −4 ]  →  [0 −3   −1   | −4  ]
[3  1  1 |  3]     [0 −2  −5  |  6 ]     [0  0  −13/3 | 26/3]

Row echelon form and reduced row echelon form (the leading 1s are the pivots):

[1  1   2  | −1 ]     [1 0 0 |  1]
[0  1  1/3 | 4/3]  →  [0 1 0 |  2]
[0  0   1  | −2 ]     [0 0 1 | −2]
Echelon Form of a Matrix
Definition: An m × n matrix A is said to be in reduced row echelon form if it has the following properties:
a) Any rows consisting entirely of zeros are at the bottom of the matrix.
b) If a row does not consist entirely of zeros, the first nonzero entry in that row is 1 (the leading 1).
c) If rows i and i + 1 are two successive rows that do not consist entirely of zeros, then the first nonzero entry of row i + 1 is to the right of the first nonzero entry of row i.
d) If a column contains the first nonzero entry of some row, then all the other entries in that column are zero.
A matrix satisfying only properties (a)–(c) is in row echelon form.
Echelon Form of a Matrix
Examples of matrices in reduced row echelon form:

[1 0 0 0]     [1 0 0 0 −2  4]     [1 2 0 0 1]
[0 1 0 0]     [0 0 1 0  4  8]     [0 0 1 2 3]
[0 0 1 0]     [0 0 0 1  7 −2]     [0 0 0 0 0]
[0 0 0 1]     [0 0 0 0  0  0]
[0 0 0 0]
Your Turn
True or false: the following matrices are in reduced row echelon form.

(1)                  (2)
[1 2 0  4]           [1 0  3 4]
[0 0 0  0]           [0 2 −2 5]
[0 0 1 −3]           [0 0  1 2]

(3)                  (4)
[0 0 0 1 0 0]        [1 2  3 4]
[0 0 0 0 1 0]        [0 1 −2 5]
[0 0 0 0 0 1]        [0 0  1 2]
[0 0 0 0 0 0]        [0 0  0 0]
Elementary Row Operations
Elementary Row Operation
Comment
⚫ The first elementary row operation, exchanging two rows, is needed to deal with zeros in the pivot positions. For example, consider

     2x2 = 1        [0 2] [x1]   [1]        [0 2 | 1]
3x1 + 4x2 = 2       [3 4] [x2] = [2]        [3 4 | 2]

⚫ No multiple of the first row will remove the 3 in the second row. The solution with equations is simply to exchange the equations; the solution with matrices is simply to exchange the rows.

3x1 + 4x2 = 2       [3 4] [x1]   [2]        [3 4 | 2]
     2x2 = 1        [0 2] [x2] = [1]        [0 2 | 1]

⚫ This problem can also occur as an intermediate step in solving a larger system and has the same fix.
Row Equivalence
What do you think?
Using elementary row operations to solve linear systems
Theorem 2.3: Let A and C be m x n matrices. If the system AX = b
has an augmented matrix [ A:b ] which is row equivalent to an
augmented matrix [ C:d ], then the systems AX = b and CX = d
have the same solution.
Using elementary row operations to solve linear systems
Theorem 2.3
Since performing elementary row operations on the augmented matrix corresponds to the following operations on the equations:
a) Interchange two equations
b) Multiply an equation by a nonzero constant
c) Add a constant multiple of one equation to another equation
❑ ➔ two row-equivalent augmented matrices correspond to two equivalent linear systems (same solutions).
Thus, if the system AX = b has an augmented matrix [A:b] which is row equivalent to an augmented matrix [C:d], then the systems AX = b and CX = d have the same solutions (A and C are m × n matrices).
Your Turn
Discussion
Back substitution
Example

        [1 2 3 4 | 5]
[C:D] = [0 1 2 3 | 6]
        [0 0 0 0 | 1]
Echelon Form of a Matrix
Homogeneous Systems
A homogeneous system AX = 0, of m equations in n unknowns, occurs often in applications.

Example:

[1 0 0 0 2 | 0]
[0 0 1 0 3 | 0]
[0 0 0 1 4 | 0]
[0 0 0 0 0 | 0]

(1) How many free variables are there in the solution of this system?
(2) What does it mean to have free variables?
The Case of m<n (self-learning)
Homogeneous Systems
Theorem 2.4: A homogeneous system of m linear equations
in n unknowns always has a nontrivial solution if m < n, i.e. if
the number of unknowns exceeds the number of equations.
    [a11 a12 ⋯ a1n | 0]          [1 b12 0 ⋯ 0 | 0]
A = [a21 a22 ⋯ a2n | 0]    ➔ B = [0  0  1 ⋯ 0 | 0]
    [ ⋮         ⋱  ⋮   ]         [ ⋮        ⋱ ⋮  ]
    [am1 am2 ⋯ amn | 0]          [0  0  0 ⋯ 1 | 0]
Echelon Form of a Matrix (self-learning)
Homogeneous Systems
Proof (continued): Let z be the number of nonzero rows of B. Then z ≤ m. Since m < n, we have z < n. So we are solving z equations in n unknowns and can solve for z of the unknowns in terms of the remaining n − z unknowns, which can take any values. So BX = 0, and thus AX = 0, has nontrivial solutions.

    [a11 a12 ⋯ a1n | 0]          [1 b12 0 ⋯ 0 | 0]
A = [a21 a22 ⋯ a2n | 0]    ➔ B = [0  0  1 ⋯ 0 | 0]
    [ ⋮         ⋱  ⋮   ]         [ ⋮        ⋱ ⋮  ]
    [am1 am2 ⋯ amn | 0]          [0  0  0 ⋯ 1 | 0]
Elementary Matrices
Definition: - An n x n elementary matrix of Type I, Type II or
Type III is a matrix obtained from the identity matrix In by
performing a single elementary row operation of Type I,
Type II or Type III, respectively
⚫ Type I - exchange two rows
⚫ Type II - multiply a row by a nonzero constant
⚫ Type III - add a multiple of one row to another
      [1 0 0]          [1 0 0]          [1 0 0]
(a)   [0 3 0]    (b)   [0 1 0]    (c)   [0 1 0]
      [0 0 1]                           [0 0 0]

      [1 0 0]          [1 0]            [1 0  0]
(d)   [0 0 1]    (e)   [2 1]      (f)   [0 2  0]
      [0 1 0]                           [0 0 −1]
Elementary Matrices (Type I)
Elementary Row Operations
Interchange rows i and j of matrix A. For example, interchanging rows 1 and 2 of a 3 × 4 matrix:

[0 1 0] [a11 a12 a13 a14]   [a21 a22 a23 a24]
[1 0 0] [a21 a22 a23 a24] = [a11 a12 a13 a14]
[0 0 1] [a31 a32 a33 a34]   [a31 a32 a33 a34]

In general, rows i and j of A can be interchanged by multiplying A by the m × m matrix E defined by

e_pq = 0, except for  e_kk = 1 if k ≠ i and k ≠ j,  e_kk = 0 if k = i or k = j,  and e_ij = e_ji = 1.
E is a special case of a permutation matrix
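The row-exchange effect of a Type I matrix is easy to verify numerically. A minimal sketch in plain Python (the helper names eye, matmul, and type1 are ours, not from the slides):

```python
def eye(n):
    """n x n identity matrix as a list of lists."""
    return [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    """Plain triple-loop matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def type1(n, i, j):
    """Elementary matrix of Type I: I_n with rows i and j exchanged."""
    E = eye(n)
    E[i], E[j] = E[j], E[i]
    return E

A = [[1.0, 2.0, 3.0, 4.0],
     [5.0, 6.0, 7.0, 8.0],
     [9.0, 10.0, 11.0, 12.0]]
B = matmul(type1(3, 0, 1), A)  # left-multiplying by E exchanges rows 1 and 2 of A
```

Left multiplication acts on rows, which is exactly the content of Theorem 2.5 below.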
Elementary Matrices (Type II)
Elementary Matrices (Type III)
Additional Comments on Elementary Matrices
Comments
The appropriate elementary matrix E may be obtained by
performing the desired operation on the rows of the m x m
identity matrix Im
Elementary Matrices
Theorem 2.5: Let A be an m x n matrix and let an elementary row
operation of Type I, Type II or Type III be performed on A to yield
matrix B. Let E be the elementary matrix obtained from Im by
performing on it the same elementary row operation that was
performed on A. Then B = EA.
Elementary Matrices and Row Equivalence
Theorem 2.6 - If A and B are m × n matrices, then A is row equivalent to B if and only if B = E_k E_(k−1) ⋯ E_2 E_1 A, where E_1, …, E_k are m × m elementary matrices.
Your Turn
Inverse of Elementary Matrices
Type I
     [0 1 0]                                        [0 1 0]
E1 = [1 0 0] = R12        (R12)^(−1) = E1^(−1) =    [1 0 0] = R12        (an elementary matrix)
     [0 0 1]                                        [0 0 1]

Type III
     [ 1 0 0]                                       [1 0 0]
E2 = [ 0 1 0] = R13^(−2)  (R13^(−2))^(−1) = E2^(−1) = [0 1 0] = R13^(2)  (an elementary matrix)
     [−2 0 1]                                       [2 0 1]

Type II
     [1 0  0 ]                                      [1 0 0]
E3 = [0 1  0 ] = R3^(1/2) (R3^(1/2))^(−1) = E3^(−1) = [0 1 0] = R3^(2)   (an elementary matrix)
     [0 0 1/2]                                      [0 0 2]
Inverse of Elementary Matrices
Proof
⚫ Type I - Let E switch rows i and j. Then the effects of E may be
undone by applying E again, i.e. EE = I. So, E is invertible and E-1 = E
⚫ Type II - Let E multiply the ith row by c ≠ 0. Let F multiply the ith row
by 1 / c. Then EF = FE = I. So, E is invertible and E-1 = F
⚫ Type III - Let E add c times the ith row to the jth row. Let F add (-c)
times the ith row to the jth row. Then EF = FE = I. So, E is invertible
and E-1 = F
Productivity Optimization Lab@NTU LA02_ Solving Linear Systems Chia-Yen Lee, Ph.D. 39
Inverse of Elementary Matrices
Lemma 2.1: Let A be an n × n matrix and let the homogeneous system AX = 0 have only the trivial solution X = 0. Then A is row equivalent to In.
Your Turn
What is the inverse of

[1 0 0 0 0]
[0 1 0 0 0]
[0 0 1 0 0]
[0 7 0 1 0]
[0 0 0 0 1] ?
Animated Linear Algebra
Linear transformations and matrices | Chapter 3, Essence of linear algebra (https://www.youtube.com/watch?v=kYB8IZa5AuE)
Finding A-1
Eg: From A to I, and finding A^(−1)
Find a sequence of elementary matrices whose product is

A = [−1 −2]
    [ 3  8]

Sol:

A = [−1 −2]  —r1^(−1)→  [1 2]  —r12^(−3)→  [1 2]  —r2^(1/2)→  [1 2]  —r21^(−2)→  [1 0] = I
    [ 3  8]             [3 8]              [0 2]              [0 1]              [0 1]

Therefore R21^(−2) R2^(1/2) R12^(−3) R1^(−1) A = I.

Thus A = (R1^(−1))^(−1) (R12^(−3))^(−1) (R2^(1/2))^(−1) (R21^(−2))^(−1)
       = R1^(−1) R12^(3) R2^(2) R21^(2)
       = [−1 0] [1 0] [1 0] [1 2]
         [ 0 1] [3 1] [0 2] [0 1]
Finding A-1
Since E_k E_(k−1) ⋯ E_2 E_1 A = I_n, we have A = E_1^(−1) E_2^(−1) ⋯ E_(k−1)^(−1) E_k^(−1), so

A^(−1) = (E_1^(−1) E_2^(−1) ⋯ E_(k−1)^(−1) E_k^(−1))^(−1) = E_k E_(k−1) ⋯ E_2 E_1.

So A^(−1) can be represented as the product of the elementary matrices that reduce A to I_n.
Create the partitioned matrix [A | I_n] and apply the operations that reduce A to I_n:

E_k E_(k−1) ⋯ E_2 E_1 [A | I_n] = [I_n | A^(−1)]
Finding A-1
Example: Compute the inverse of

A = [1 1 3]
    [1 2 3]
    [0 1 1]

[1 1 3 | 1 0 0]    [1 1 3 |  1 0 0]    [1 1 3 |  1  0 0]
[1 2 3 | 0 1 0] →  [0 1 0 | −1 1 0] →  [0 1 0 | −1  1 0]
[0 1 1 | 0 0 1]    [0 1 1 |  0 0 1]    [0 0 1 |  1 −1 1]

   [1 1 0 | −2 3 −3]    [1 0 0 | −1 2 −3]              [−1  2 −3]
→  [0 1 0 | −1 1  0] →  [0 1 0 | −1 1  0]     A^(−1) = [−1  1  0]
   [0 0 1 |  1 −1 1]    [0 0 1 |  1 −1 1]              [ 1 −1  1]
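The [A | I_n] procedure can be carried out mechanically. A minimal Gauss-Jordan sketch in plain Python (the function name invert is ours; it assumes A is nonsingular, so a nonzero pivot can always be found by a row exchange):

```python
def invert(A):
    """Invert a square matrix by Gauss-Jordan reduction of [A | I]."""
    n = len(A)
    # Build the partitioned matrix [A | I_n].
    M = [list(map(float, A[i])) + [1.0 if j == i else 0.0 for j in range(n)]
         for i in range(n)]
    for c in range(n):
        # Type I: exchange rows so the pivot entry is nonzero.
        p = next(r for r in range(c, n) if M[r][c] != 0)
        M[c], M[p] = M[p], M[c]
        # Type II: scale the pivot row so the pivot becomes 1.
        piv = M[c][c]
        M[c] = [x / piv for x in M[c]]
        # Type III: clear every other entry in the pivot column.
        for r in range(n):
            if r != c and M[r][c] != 0:
                f = M[r][c]
                M[r] = [x - f * y for x, y in zip(M[r], M[c])]
    return [row[n:] for row in M]  # the right half is A^(-1)

Ainv = invert([[1, 1, 3], [1, 2, 3], [0, 1, 1]])
```

Running it on the example matrix above reproduces the same inverse as the hand computation.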
Finding A-1
Equivalent statements for n x n matrix A
(proof omitted)
1. A is nonsingular.
2. AX = 0 has only the trivial solution. (implied by 1.)
3. A is row (column) equivalent to In. (implied by 2.)
4. The system AX = B has a unique solution for every n × 1 matrix B. (implied by 1.)
5. A is a product of elementary matrices. (implied by 3.)
6. det(A) ≠ 0. (will be introduced in Chapter 3)
Finding A-1
Proof
⚫ Let A be singular. Apply Gauss-Jordan reduction to A to get a
matrix B in reduced row echelon form which is row equivalent to A.
B cannot be In since if it were, A would be nonsingular. Since B is
in reduced row echelon form, B must have at least one row of
zeros at the bottom.
(… continued on the next slide)
Finding A-1
Proof
⚫ Let A be row equivalent to a matrix B that has a row of all zeros.
⚫ [First show that B is singular]
⚫ The matrix B is singular, since there does not exist a matrix C such that BC = CB = In.
⚫ To see this, let the ith row of B consist of all zeros. The ith row of BC is generated by multiplying each column of C by the ith row of B. So the ith row of BC is all zeros, while the ith row of In is not; hence BC ≠ In, and B must be singular.

[b11  ⋯     ] [c11 ⋯ c1n]
[0  0 ⋯ 0   ] [ ⋮       ]   →  ith row of BC = [0 0 ⋯ 0], but the ith row of In contains a 1.
[      ⋯ bnn] [cn1 ⋯ cnn]
Finding A-1
Theorem 2.11: If A and B are n x n matrices such that AB = In,
then BA = In . Thus A is nonsingular and B = A-1
Equivalent Matrices
Definition: If A and B are two m × n matrices, then A is equivalent to B if B is obtained from A by a finite sequence of elementary row or elementary column operations.
LU-Factorization
Gauss-Jordan reduction works fine for small systems. For large systems, there are more efficient methods that do not require computing the reduced row echelon form, or even the row echelon form.
If the coefficient matrix has either of the triangular forms, the system is easy to solve.
We will develop a method for solving a general system AX = B based on these observations.
LU-Factorization
Observations
⚫ An upper triangular system AX = B may be solved by back substitution; analogously, a lower triangular system AX = B,

[a11  0   0  ⋯  0 ] [x1]   [b1]
[a21 a22  0  ⋯  0 ] [x2]   [b2]
[a31 a32 a33 ⋯  0 ] [x3] = [b3]        with a_ii ≠ 0 for 1 ≤ i ≤ n,
[ ⋮            ⋱   ] [ ⋮]   [ ⋮]
[an1 an2 an3 ⋯ ann] [xn]   [bn]

may be solved by forward substitution:

a11 x1 = b1            ⇒  x1 = b1 / a11
a21 x1 + a22 x2 = b2   ⇒  x2 = (b2 − a21 x1) / a22
and in general  x_j = (b_j − Σ_{k=1}^{j−1} a_jk x_k) / a_jj,   j = 2, 3, …, n
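The forward-substitution formula translates directly into code. A minimal sketch in plain Python (the function name is ours), shown on the unit lower triangular factor that appears in the LU example later in this chapter:

```python
def forward_substitution(L, b):
    """Solve L x = b for lower triangular L with nonzero diagonal,
    using x_j = (b_j - sum_{k<j} L[j][k] x_k) / L[j][j]."""
    x = []
    for j in range(len(b)):
        s = sum(L[j][k] * x[k] for k in range(j))
        x.append((b[j] - s) / L[j][j])
    return x

# Lower triangular factor and right-hand side from the later LU example.
y = forward_substitution([[1, 0, 0], [0, 1, 0], [2, -4, 1]], [-5, -1, -20])
```

Each unknown is computed once, in order, so the cost is O(n²) rather than the O(n³) of full elimination.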
LU-Factorization
Example - Consider AX = B
[  6 −2  −4  4] [x1]   [  2]
[  3 −3  −6  1] [x2] = [ −4]
[−12  8  21 −8] [x3]   [  8]
[ −6  0 −10  7] [x4]   [−43]
LU-Factorization
Example (continued)
A = LU; let Z = UX and solve LZ = B:

[  1   0  0 0] [z1]   [  2]        z1 = 2
[ 1/2  1  0 0] [z2]   [ −4]        (1/2)z1 + z2 = −4         ⇒ z2 = −5
[ −2  −2  1 0] [z3] = [  8]        −2z1 − 2z2 + z3 = 8       ⇒ z3 = 2
[ −1   1 −2 1] [z4]   [−43]        −z1 + z2 − 2z3 + z4 = −43 ⇒ z4 = −32

Questions
How do we find L and U?
LU-Factorization
If the n × n matrix A can be written as the product of a lower triangular matrix L and an upper triangular matrix U, then A = LU is an LU-factorization of A.

E_k ⋯ E_2 E_1 A = U
A = E_1^(−1) E_2^(−1) ⋯ E_k^(−1) U
A = LU
LU-Factorization
Eg: LU-factorization

(a) A = [1 2]      (b) A = [1  −3 0]
        [1 0]              [0   1 3]
                           [2 −10 2]

Sol: (a)

A = [1 2]  —r12^(−1)→  [1  2] = U
    [1 0]              [0 −2]

R12^(−1) A = U
A = (R12^(−1))^(−1) U = LU

L = (R12^(−1))^(−1) = R12^(1) = [1 0]
                                [1 1]
LU-Factorization
(b)

A = [1  −3 0]  —r13^(−2)→  [1 −3 0]  —r23^(4)→  [1 −3  0]
    [0   1 3]              [0  1 3]             [0  1  3] = U
    [2 −10 2]              [0 −4 2]             [0  0 14]

R23^(4) R13^(−2) A = U

Ax = b. If A = LU, then LUx = b.
Let y = Ux; then Ly = b.
Two steps: (1) solve Ly = b for y; (2) solve Ux = y for x.
LU-Factorization
A = [1  −3 0]   [1  0 0] [1 −3  0]
    [0   1 3] = [0  1 0] [0  1  3] = LU
    [2 −10 2]   [2 −4 1] [0  0 14]

(1) Let y = Ux, and solve Ly = b:

[1  0 0] [y1]   [ −5]      y1 = −5
[0  1 0] [y2] = [ −1]      y2 = −1
[2 −4 1] [y3]   [−20]      y3 = −20 − 2y1 + 4y2 = −20 − 2(−5) + 4(−1) = −14
LU-Factorization
(2) Solve Ux = y by back substitution:

    [ 1]
x = [ 2]
    [−1]
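The two-step solve — forward substitution for Ly = b, then back substitution for Ux = y — can be sketched in plain Python (function names are ours), using the L, U, and b of example (b):

```python
def forward_sub(L, b):
    """Solve L y = b for lower triangular L with nonzero diagonal."""
    y = []
    for i in range(len(b)):
        y.append((b[i] - sum(L[i][k] * y[k] for k in range(i))) / L[i][i])
    return y

def back_sub(U, y):
    """Solve U x = y for upper triangular U with nonzero diagonal."""
    n = len(y)
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(U[i][k] * x[k] for k in range(i + 1, n))) / U[i][i]
    return x

# Factors from example (b): A = L U, right-hand side b = (-5, -1, -20).
L = [[1, 0, 0], [0, 1, 0], [2, -4, 1]]
U = [[1, -3, 0], [0, 1, 3], [0, 0, 14]]
y = forward_sub(L, [-5, -1, -20])   # step (1): L y = b
x = back_sub(U, y)                  # step (2): U x = y
```

Both triangular solves cost O(n²), which is why an LU-factorization pays off when the same A is solved against many right-hand sides.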
LU-Factorization
Example: Find the LU-factorization of

A = [0 1 2]
    [1 1 1]
    [2 3 5]

Sol:
We cannot do r21^(1) here, because LU-factorization requires performing only the row operation of adding a multiple of one row to another row below it; with a zero in the (1,1) position, a row exchange is needed first.
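One standard remedy, not spelled out on this slide, is to exchange rows first and factor PA = LU for a permutation matrix P. A minimal Doolittle-style sketch with partial pivoting, in plain Python (all function and variable names are ours):

```python
def lu_partial_pivot(A):
    """Factor P A = L U with partial pivoting (Doolittle style).
    Returns the row order p (so P A is [A[i] for i in p]), L, and U."""
    n = len(A)
    U = [list(map(float, row)) for row in A]
    Lo = [[0.0] * n for _ in range(n)]
    p = list(range(n))
    for k in range(n):
        # Pick the largest entry in column k as pivot (handles a zero pivot).
        m = max(range(k, n), key=lambda i: abs(U[i][k]))
        U[k], U[m] = U[m], U[k]
        Lo[k], Lo[m] = Lo[m], Lo[k]   # keep already-computed multipliers aligned
        p[k], p[m] = p[m], p[k]
        Lo[k][k] = 1.0
        for i in range(k + 1, n):
            f = U[i][k] / U[k][k]
            Lo[i][k] = f
            for j in range(k, n):
                U[i][j] -= f * U[k][j]
    return p, Lo, U

p, Lfac, Ufac = lu_partial_pivot([[0, 1, 2], [1, 1, 1], [2, 3, 5]])
```

For the matrix of this example, the factorization succeeds once the rows are reordered, and Lfac · Ufac reproduces the permuted rows of A.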
Your Turn
Find an LU-factorization of the matrix

A = [1 1  1  1]
    [1 2  3  4]
    [1 3  6 10]
    [1 4 10 20]
Applications: Polynomial Curve Fitting
Polynomial Curve Fitting (Larson, 2016, p.26)
⚫ Given n points (x1, y1), (x2, y2), …, (xn, yn) in the xy-plane with distinct x-values, find a polynomial function of degree n−1 passing through these n points.
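Interpolating n points amounts to solving the Vandermonde system V c = y, where v_ij = x_i^j and c holds the polynomial coefficients. A minimal sketch in plain Python on hypothetical data (all names and the sample points are ours):

```python
def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(b)
    M = [list(map(float, A[i])) + [float(b[i])] for i in range(n)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))  # pivot row
        M[c], M[p] = M[p], M[c]
        M[c] = [v / M[c][c] for v in M[c]]                # scale pivot to 1
        for r in range(n):
            if r != c:
                f = M[r][c]
                M[r] = [v - f * w for v, w in zip(M[r], M[c])]
    return [M[i][n] for i in range(n)]

def fit_polynomial(xs, ys):
    """Coefficients (c0, ..., c_{n-1}) of the degree n-1 interpolant."""
    V = [[x ** j for j in range(len(xs))] for x in xs]  # Vandermonde matrix
    return solve(V, ys)

# Hypothetical data: three points lying on y = 2 - x + x^2.
coef = fit_polynomial([0, 1, 2], [2, 2, 4])
```

With three points the system is 3 × 3 and the recovered quadratic passes through all of them.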
The Global Positioning System (GPS)
GPS
⚫ z-axis: through the North Pole.
⚫ To simplify the discussion, set the radius of the earth to 1, so the surface of the earth is x² + y² + z² = 1.
⚫ Time unit: 1/100 second.
⚫ Normalized light speed = 0.47:
  ⚫ light speed: 299792458 m/s
  ⚫ earth radius: 6371000 m
  ⚫ normalized light speed: (L) / (E) = 47.05579 units/s ≈ 0.47 units per time unit
⚫ Idea: three-dimensional triangulation.
⚫ Complication: we do not know the distance to the satellites!
GPS (Cont’d.)
We need a very precise clock to compute the distance from the receiver to the satellites, but this type of clock is very expensive. A cheap solution is needed:
→ solve for the distance using one additional satellite.

Time for sending the signal from each satellite:

Satellite   Position              Time (from midnight), 1/100 s
1           (1.11, 2.55, 2.14)    1.29
2           (2.87, 0.00, 1.43)    1.31
3           (0.00, 1.08, 2.29)    2.75
4           (1.54, 1.01, 1.23)    4.06
Ordinary Least Squares (OLS)
⚫ Linear equation: y_i = β_0 + Σ_{j=1}^p β_j x_ji + ε_i
  ─ y_i: response variable
  ─ x_ji: independent variables
  ─ β_0: intercept
  ─ β_j: coefficient of the independent variable (i.e., slope)
  ─ ε_i: noise (also called the residual)
⚫ The aim of this approach is to minimize the residuals over all observations.
⚫ We square each residual before minimizing.
⚫ We can derive the intercept and slope parameters using basic calculus.
  Fitted model: ŷ_i = β̂_0 + Σ_{j=1}^p β̂_j x_ji
Ordinary Least Squares (OLS)
Minimizing the Residual Sum of Squares (RSS)
⚫ Minimize RSS = Σ_{i=1}^n ε_i² = Σ_{i=1}^n (y_i − ŷ_i)² = Σ_{i=1}^n (y_i − (β_0 + Σ_{j=1}^p β_j x_ji))²
⚫ In matrix form: minimize RSS = (Y − Xβ)ᵀ (Y − Xβ)
⚫ This is a quadratic function of the p + 1 variables. Differentiating with respect to β, we obtain
  ─ ∂RSS/∂β = −2Xᵀ(Y − Xβ)
  ─ ∂²RSS/∂β∂βᵀ = 2XᵀX
⚫ Assuming X has full column rank (hence XᵀX is positive definite), we set the first derivative to zero:
  ─ Xᵀ(Y − Xβ) = 0
⚫ Then we obtain the unique solution β̂ = (XᵀX)⁻¹ XᵀY. For example, for y_i = β_0 + β_1 x_1i + β_2 x_2i, the unknowns are β_0, β_1, β_2.
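In practice, β̂ is computed by solving the normal equations XᵀXβ = XᵀY rather than forming the inverse explicitly. A minimal sketch in plain Python on hypothetical data lying exactly on y = 1 + 2x (all names and data are ours):

```python
def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting."""
    n = len(b)
    M = [list(map(float, A[i])) + [float(b[i])] for i in range(n)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        M[c] = [v / M[c][c] for v in M[c]]
        for r in range(n):
            if r != c:
                f = M[r][c]
                M[r] = [v - f * w for v, w in zip(M[r], M[c])]
    return [M[i][n] for i in range(n)]

def ols(X, Y):
    """Solve the normal equations X^T X beta = X^T Y."""
    n, p = len(X), len(X[0])
    XtX = [[sum(X[i][j] * X[i][k] for i in range(n)) for k in range(p)]
           for j in range(p)]
    XtY = [sum(X[i][j] * Y[i] for i in range(n)) for j in range(p)]
    return solve(XtX, XtY)

# Hypothetical data: the first column of X is the intercept column of ones.
X = [[1, 0], [1, 1], [1, 2], [1, 3]]
Y = [1, 3, 5, 7]
beta = ols(X, Y)  # should recover intercept 1 and slope 2
```

Because the data lie exactly on the line, the residuals are zero and the recovered coefficients match the generating intercept and slope.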
Animated Linear Algebra
Matrix multiplication as composition | Chapter 4, Essence of linear algebra (https://www.youtube.com/watch?v=XkY2DOUCWMU)
Self-Practice Ch. 2