Tut 2 - FromStats2DM - Linear Algebra and Convex Optimization

The document provides tutorials on topics in linear algebra and convex optimization:
- It demonstrates matrix multiplication and determinants, showing that 𝐴𝐵 is not always equal to 𝐵𝐴.
- Exercises are presented on reducing matrices to row echelon form and finding bases for row and column spaces.
- Convex optimization problems are formulated using Lagrange multipliers to find critical points that optimize objective functions subject to constraints.


From Statistics to Data Mining
Master 1
Computational Colour and Spectral Imaging (COSI)
Cyber‐Physical Social System (CPS2)
Saint-Étienne, France

Fabrice MUHLENBACH
https://perso.univ-st-etienne.fr/muhlfabr/
e-mail: [email protected]
Tutorial

• Linear Algebra → Matrix Multiplication


➢ compute the matrix product 𝐴𝐵 with 𝐴 = [3 −6; −2 4] and 𝐵 = [2 4; 1 2]

➢ then compute the matrix product 𝐵𝐴

➢ what can you conclude?

From Statistics 2 DM – Tutorial, F. Muhlenbach


Tutorial

• Linear Algebra → Matrix Multiplication


➢ results:
𝐴𝐵 = [0 0; 0 0]
➢ but
𝐵𝐴 = [−2 4; −1 2]

➢ we can observe that 𝐴𝐵 ≠ 𝐵𝐴, which is the most common situation: matrix multiplication is not commutative in general
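The two products can be checked numerically; a minimal sketch using NumPy (an addition, not part of the original slides):

```python
import numpy as np

A = np.array([[3, -6],
              [-2, 4]])
B = np.array([[2, 4],
              [1, 2]])

AB = A @ B  # the zero matrix
BA = B @ A  # [[-2, 4], [-1, 2]]

print(np.array_equal(AB, BA))  # False: multiplication is not commutative
```

Note also that 𝐴𝐵 is the zero matrix even though neither factor is zero, another way matrix products differ from products of numbers.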



Tutorial

• Linear Algebra → Matrix Determinant


➢ compute the determinant of the (3 × 3) square matrix 𝐶 = [3 2 −1; −1 2 3; 1 1 1]



Tutorial

• Linear Algebra → Matrix Determinant


➢ Solution 1 (rule of Sarrus): we extend the matrix with its first 2 columns:
[3 2 −1 | 3 2]
[−1 2 3 | −1 2]
[1 1 1 | 1 1]
➢ we add the products along the NW-SE diagonals and subtract the products along the SW-NE diagonals
➢ the result gives:
(3 × 2 × 1) + (2 × 3 × 1) + ((−1) × (−1) × 1)
− ((−1) × 2 × 1) − (3 × 3 × 1) − (2 × (−1) × 1)
= 6 + 6 + 1 + 2 − 9 + 2
= 8
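The value can be verified numerically; a quick sketch with NumPy (an addition, assuming NumPy is available):

```python
import numpy as np

C = np.array([[3, 2, -1],
              [-1, 2, 3],
              [1, 1, 1]])

det_C = np.linalg.det(C)  # floating-point result, approximately 8.0
print(round(det_C))  # 8
```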
Tutorial

• Linear Algebra → Matrix Determinant


➢ Solution 2 (cofactor expansion along the 1st row): we alternately add and subtract the terms of the 1st row of the (3 × 3)-matrix 𝐶, each multiplied by the determinant of the (2 × 2)-square matrix extracted from 𝐶 by deleting that term's row and column
𝐶 = [3 2 −1; −1 2 3; 1 1 1]
➢ 3 × det[2 3; 1 1] − 2 × det[−1 3; 1 1] + (−1) × det[−1 2; 1 1]
➢ = 3 × (2 × 1 − 3 × 1) − 2 × ((−1) × 1 − 3 × 1) − 1 × ((−1) × 1 − 2 × 1)
➢ = 3 × (−1) − 2 × (−4) − 1 × (−3)
➢ = −3 + 8 + 3 = 8
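The cofactor expansion can also be written directly as code; a small sketch (the helper names `det2` and `det3` are ours, not from the slides):

```python
def det2(a, b, c, d):
    """Determinant of the 2x2 matrix [[a, b], [c, d]]."""
    return a * d - b * c

def det3(m):
    """Cofactor expansion of a 3x3 matrix along its first row."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return (a * det2(e, f, h, i)
            - b * det2(d, f, g, i)
            + c * det2(d, e, g, h))

C = [[3, 2, -1], [-1, 2, 3], [1, 1, 1]]
print(det3(C))  # 8
```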



Tutorial

• Linear Algebra → Reduced row echelon form


➢ the matrix 𝑅 = [1 −2 5 0 3; 0 1 3 0 0; 0 0 0 1 0; 0 0 0 0 0] is in row-echelon form
➢ the vectors 𝑟1 = (1, −2, 5, 0, 3), 𝑟2 = (0, 1, 3, 0, 0) and 𝑟3 = (0, 0, 0, 1, 0) form a basis for the row space of 𝑅,
and the vectors 𝑐1 = (1, 0, 0, 0)ᵀ, 𝑐2 = (−2, 1, 0, 0)ᵀ and 𝑐4 = (0, 0, 1, 0)ᵀ
form a basis for the column space of 𝑅



Tutorial

• Linear Algebra → Reduced row echelon form


➢ exercise: reduce the matrix 𝐴 to row-echelon form
(first step for finding the bases for the row and column spaces)
𝐴 = [1 −3 4 −2 5 4; 2 −6 9 −1 8 2; 2 −6 9 −1 9 7; −1 3 −4 2 −5 −4]
➢ solution: reduction of 𝐴 to row-echelon form:
𝑅 = [1 −3 4 −2 5 4; 0 0 1 3 −2 −6; 0 0 0 0 1 5; 0 0 0 0 0 0]



Tutorial

• Linear Algebra → Reduced row echelon form


➢ exercise: find a basis for the space spanned by the vectors
𝑣1 = (1, −2, 0, 0, 3), 𝑣2 = (2, −5, −3, −2, 6),
𝑣3 = (0, 5, 15, 10, 0), 𝑣4 = (2, 6, 18, 8, 6).
➢ solution: write down the vectors as row vectors first, then reduce the resulting matrix to row-echelon form:
[1 −2 0 0 3; 2 −5 −3 −2 6; 0 5 15 10 0; 2 6 18 8 6] → [1 −2 0 0 3; 0 1 3 2 0; 0 0 1 1 0; 0 0 0 0 0]
➢ the nonzero row vectors in this matrix are
𝑤1 = (1, −2, 0, 0, 3), 𝑤2 = (0, 1, 3, 2, 0), 𝑤3 = (0, 0, 1, 1, 0)
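The same computation can be sketched with SymPy's `rowspace`, which returns the nonzero rows of an echelon form (the exact entries may differ from the slide's by row scaling, but the dimension agrees):

```python
from sympy import Matrix

M = Matrix([[1, -2, 0, 0, 3],
            [2, -5, -3, -2, 6],
            [0, 5, 15, 10, 0],
            [2, 6, 18, 8, 6]])

basis = M.rowspace()  # nonzero rows of an echelon form of M
print(len(basis))  # 3: the spanned space has dimension 3
```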



Tutorial

• Linear Algebra → Reduced row echelon form


➢ keeping in mind that 𝐴 and 𝑅 may have different column spaces, we
cannot find a basis for the column space of 𝐴 directly from the
column vectors of 𝑅
➢ however, if we can find a set of column vectors of 𝑅 that forms a
basis for the column space of 𝑅, then the corresponding column
vectors of 𝐴 will form a basis for the column space of 𝐴
➢ in the previous example, the basis vectors obtained for the column space of 𝐴 consisted of column vectors of 𝐴, but the basis vectors obtained for the row space of 𝐴 were not all row vectors of 𝐴
→ to get a basis made of actual rows of 𝐴, we transpose the matrix and work with the column space of 𝐴ᵀ



Tutorial

• Linear Algebra → Reduced row echelon form


1 −2 0 0 3
➢ find a basis for the row space of  2 −5 −3 −2 6 
A=
 0 5 15 10 0
 
 2 6 18 8 6
consisting entirely of row vectors from 𝐴
➢ solution:  1 2 0 2 1 2 0 2 
 −2 −5 5 6  0 1 5 −10 
  
A =  0 −3 15 18
T
0 0 0 1 
   
 0 −2 10 8  0 0 0 0 
 3 6 0 6  0 0 0 0 
➢ the nonzero vectors in this matrix are
𝑤1 = (1, 2, 0, 2), 𝑤2 = (0, 1, 5, −10) and 𝑤3 = (0, 0, 0, 1)
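The transposition trick can be scripted as well; a SymPy sketch (an addition, not from the slides):

```python
from sympy import Matrix

A = Matrix([[1, -2, 0, 0, 3],
            [2, -5, -3, -2, 6],
            [0, 5, 15, 10, 0],
            [2, 6, 18, 8, 6]])

# pivot columns of A^T correspond to rows of A that form a row-space basis
_, pivots = A.T.rref()
basis_rows = [list(A.row(i)) for i in pivots]
print(pivots)  # (0, 1, 3): rows 1, 2 and 4 of A form the basis
```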
Tutorial

• Convex Optimization
➢ Exercise 1: we want to solve:
max 𝑥𝑦
s.t. 𝑥 + 3𝑦 = 24

➢ Questions:
o what are the objective and constraint functions?
o solve this system by using the method of Lagrange multipliers



Tutorial

• Convex Optimization
➢ Solution:
max 𝑥𝑦
s.t. 𝑥 + 3𝑦 = 24

o objective function (to maximize): 𝑓(𝑥, 𝑦) = 𝑥𝑦

o constraint function: 𝑥 + 3𝑦 = 24 ⟺ 𝑥 + 3𝑦 − 24 = 0

o method of Lagrange multipliers:
→ ℒ(𝑥, 𝑦, 𝜆) = (objective function to optimize) − 𝜆 × (constraint)
o Lagrange function to optimize:
ℒ(𝑥, 𝑦, 𝜆) = 𝑥𝑦 − 𝜆(𝑥 + 3𝑦 − 24) = 𝑥𝑦 − 𝜆𝑥 − 3𝜆𝑦 + 24𝜆
Tutorial

• Convex Optimization
o ℒ(𝑥, 𝑦, 𝜆) = 𝑥𝑦 − 𝜆(𝑥 + 3𝑦 − 24) = 𝑥𝑦 − 𝜆𝑥 − 3𝜆𝑦 + 24𝜆
o optimizing a function → finding the critical points
→ finding where the derivatives of the function are equal to zero
o we have 3 variables 𝑥, 𝑦, 𝜆, so we take the partial derivative with respect to each of them
o there are 3 first-order conditions:
o 𝜕ℒ/𝜕𝑥 = 0 ⇔ 𝑦 − 𝜆 = 0 ⇔ 𝜆 = 𝑦  (1)
o 𝜕ℒ/𝜕𝑦 = 0 ⇔ 𝑥 − 3𝜆 = 0 ⇔ 𝜆 = 𝑥/3  (2)
o 𝜕ℒ/𝜕𝜆 = 0 ⇔ 𝑥 + 3𝑦 − 24 = 0  (3)
Tutorial

• Convex Optimization
o ℒ(𝑥, 𝑦, 𝜆) = 𝑥𝑦 − 𝜆(𝑥 + 3𝑦 − 24) = 𝑥𝑦 − 𝜆𝑥 − 3𝜆𝑦 + 24𝜆
o (1) and (2): 𝑦 = 𝑥/3  (4)
o (3) and (4): 𝑥 + 3 × (𝑥/3) − 24 = 0
o ⇔ 𝑥 + 𝑥 − 24 = 0
o ⇔ 2𝑥 = 24
o ⇔ 𝑥 = 12
o with (4): 𝑦 = 𝑥/3 = 12/3 = 4
o with (1): 𝜆 = 𝑦 = 4, therefore 𝑥 = 12, 𝑦 = 4, 𝜆 = 4
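The whole system of first-order conditions can be solved symbolically; a SymPy sketch of the derivation above (not part of the original slides):

```python
from sympy import symbols, diff, solve

x, y, lam = symbols('x y lam')
L = x*y - lam*(x + 3*y - 24)  # Lagrange function

# first-order conditions: every partial derivative equal to zero
foc = [diff(L, v) for v in (x, y, lam)]
sol = solve(foc, [x, y, lam], dict=True)[0]
print(sol[x], sol[y], sol[lam])  # 12 4 4
```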
Tutorial

• Convex Optimization
➢ Exercise 2: we want to solve:
min 2𝑥 + 2𝑦
s.t. 𝑥𝑦 = 4
➢ Solution: → constraint: 𝑥𝑦 = 4 ⟺ 𝑥𝑦 − 4 = 0
o ℒ(𝑥, 𝑦, 𝜆) = 2𝑥 + 2𝑦 − 𝜆(𝑥𝑦 − 4). FOC:
o 𝜕ℒ/𝜕𝑥 = 0 ⇔ 2 − 𝜆𝑦 = 0 ⇔ 2 = 𝜆𝑦 ⇔ 𝜆 = 2/𝑦  (1)
o 𝜕ℒ/𝜕𝑦 = 0 ⇔ 2 − 𝜆𝑥 = 0 ⇔ 2 = 𝜆𝑥 ⇔ 𝜆 = 2/𝑥  (2)
o 𝜕ℒ/𝜕𝜆 = 0 ⇔ 𝑥𝑦 − 4 = 0  (3)



Tutorial

• Convex Optimization
o ℒ(𝑥, 𝑦, 𝜆) = 2𝑥 + 2𝑦 − 𝜆(𝑥𝑦 − 4)
o (1) and (2): 2/𝑦 = 2/𝑥 ⇔ 𝑥 = 𝑦  (4)
o (3) and (4): 𝑥² − 4 = 0 ⇔ 𝑥² = 4 ⟺ 𝑥 = 2 (taking 𝑥 > 0; 𝑥 = −2 gives the symmetric critical point (−2, −2))  (5)
o (4) and (5): 𝑦 = 2 and 𝜆 = 1
o therefore 𝑥 = 2, 𝑦 = 2, 𝜆 = 1
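This exercise can be solved the same way in SymPy; a sketch (an addition to the slides; note the solver also returns the symmetric critical point with negative coordinates):

```python
from sympy import symbols, diff, solve

x, y, lam = symbols('x y lam')
L = 2*x + 2*y - lam*(x*y - 4)  # Lagrange function for exercise 2

foc = [diff(L, v) for v in (x, y, lam)]
sols = solve(foc, [x, y, lam], dict=True)
# two critical points: (2, 2) with lam = 1, and (-2, -2) with lam = -1
print(sorted((s[x], s[y], s[lam]) for s in sols))  # [(-2, -2, -1), (2, 2, 1)]
```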



Tutorial

• Convex Optimization
➢ Exercise 3: we want to solve:
𝑓(𝑥, 𝑦) = 𝑥²𝑦 → max
s.t. 𝑥² + 𝑦² = 1

o what is the graphical interpretation of 𝑥² + 𝑦² = 1?

o what is the graphical interpretation of 𝑥²𝑦?
o therefore, what is the graphical interpretation of the whole system
𝑓(𝑥, 𝑦) = 𝑥²𝑦 → max
s.t. 𝑥² + 𝑦² = 1 ?



Tutorial

• Convex Optimization
➢ Solution: we want to solve:
𝑓(𝑥, 𝑦) = 𝑥²𝑦 → max
s.t. 𝑥² + 𝑦² = 1
o graphical interpretation of 𝑥² + 𝑦² = 1:
❑ for any point 𝐴 = (𝑥𝐴, 𝑦𝐴) on the curve, 𝑥𝐴² + 𝑦𝐴² = ‖(𝑥𝐴, 𝑦𝐴)‖² = 1
❑ ‖(𝑥𝐴, 𝑦𝐴)‖ = ‖𝑂𝐴‖ = 1, i.e. the distance from the origin 𝑂 to 𝐴 equals 1
❑ → circle of radius 1 centered at the origin
[figure: unit circle with a point 𝐴 and its coordinates 𝑥𝐴, 𝑦𝐴]
Tutorial

• Convex Optimization
o graphical interpretation of 𝑥² + 𝑦² = 1: circle of radius 1
o 𝑥² + 𝑦² = 1 ⟺ 𝑦² = 1 − 𝑥²
o ⟺ 𝑦 = √(1 − 𝑥²) or 𝑦 = −√(1 − 𝑥²)



Tutorial

• Convex Optimization
o graphical interpretation of 𝑥²𝑦: the level curves 𝑥²𝑦 = 𝑧 are the curves 𝑦 = 𝑧 × 1/𝑥²
[figure: level curves 𝑦 = 𝑧/𝑥² for 𝑧 = −2, −1, 0, 1, 2, 3]


Tutorial

• Convex Optimization
o graphical interpretation of the system:
𝑓(𝑥, 𝑦) = 𝑥²𝑦 → max ⇒ level curves 𝑦 = 𝑧 × 1/𝑥²
s.t. 𝑥² + 𝑦² = 1 ⇒ 𝑦 = ±√(1 − 𝑥²)
o when 𝑓(𝑥, 𝑦) = 𝑧 ≤ 0.3, the level curve intersects the circle: solutions satisfying the 2 conditions exist
o when 𝑓(𝑥, 𝑦) = 𝑧 ≥ 0.4, no intersection → no solution
o maxima → tangency points
o 𝑧? graphically, 𝑧 ∈ [0.3; 0.4]
[figure: level curves for 𝑧 = 0, 0.1, 0.2, 0.3, 0.4, 1.0 together with the unit circle]
Tutorial

• Convex Optimization
➢ Solution: we want to solve:
𝑓(𝑥, 𝑦) = 𝑥²𝑦 → max
s.t. 𝑥² + 𝑦² = 1
➢ constraint: 𝑥² + 𝑦² = 1 ⟺ 𝑥² + 𝑦² − 1 = 0
o ℒ(𝑥, 𝑦, 𝜆) = 𝑥²𝑦 − 𝜆(𝑥² + 𝑦² − 1). FOC:
o 𝜕ℒ/𝜕𝑥 = 0 ⇔ 2𝑥𝑦 − 2𝜆𝑥 = 0 ⇔ 2𝑥𝑦 = 2𝜆𝑥 ⇔ (for 𝑥 ≠ 0) 𝜆 = 𝑦  (1)
o 𝜕ℒ/𝜕𝑦 = 0 ⇔ 𝑥² − 2𝜆𝑦 = 0 ⇔ 𝑥² = 2𝜆𝑦, with (1) ⇒ 𝑥² = 2𝑦²  (2)
o 𝜕ℒ/𝜕𝜆 = 0 ⇔ 𝑥² + 𝑦² − 1 = 0  (3)



Tutorial

• Convex Optimization
o ℒ(𝑥, 𝑦, 𝜆) = 𝑥²𝑦 − 𝜆(𝑥² + 𝑦² − 1)
o (3) and (2): 2𝑦² + 𝑦² − 1 = 0 ⇔ 3𝑦² = 1
o ⇔ 𝑦² = 1/3 ⇒ 𝑦 = ±1/√3  (4)

o (2) and (4): 𝑥² = 2𝑦² = 2 × 1/3 ⇔ 𝑥² = 2/3 ⇒ 𝑥 = ±√(2/3)  (5)

o (1) and (4): 𝑦 = 𝜆 ⇒ 𝜆 = ±1/√3

o solutions: (√(2/3), 1/√3), (√(2/3), −1/√3), (−√(2/3), 1/√3) and (−√(2/3), −1/√3)



Tutorial

• Convex Optimization
o ℒ(𝑥, 𝑦, 𝜆) = 𝑥²𝑦 − 𝜆(𝑥² + 𝑦² − 1)
o solutions: (√(2/3), 1/√3), (√(2/3), −1/√3), (−√(2/3), 1/√3) and (−√(2/3), −1/√3)

o → the possible solutions are the four points where the contour lines are tangent to the circle
o which ones maximize the function 𝑓(𝑥, 𝑦) = 𝑥²𝑦?
o 𝑦 cannot be negative, because 𝑥²𝑦 would then be negative
o 𝑓(√(2/3), 1/√3) = 𝑓(−√(2/3), 1/√3) = (2/3) × (1/√3) → 2 solutions
o 𝑧 = 𝑥²𝑦 = (2/3)√(1/3) ≅ 0.3849 (reminder: graphically, 𝑧 ∈ [0.3; 0.4])
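The full system, including the 𝑥 = 0 branch discarded by the 𝑥 ≠ 0 assumption in (1), can be solved with SymPy; a sketch (an addition, not from the slides):

```python
from sympy import symbols, diff, solve

x, y, lam = symbols('x y lam', real=True)
L = x**2*y - lam*(x**2 + y**2 - 1)  # Lagrange function for exercise 3

# solve all three first-order conditions at once
foc = [diff(L, v) for v in (x, y, lam)]
sols = solve(foc, [x, y, lam], dict=True)

# evaluate f = x**2 * y at every critical point and keep the largest value
z_max = max(float(s[x]**2 * s[y]) for s in sols)
print(round(z_max, 4))  # 0.3849
```

The solver also returns the critical points (0, ±1) with 𝜆 = 0, which lie on the circle but give 𝑓 = 0, so the maximum is still (2/3)√(1/3) ≅ 0.3849 as found graphically.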


Tutorial

• Convex Optimization
o function to maximize: 𝑧 = (2/3)√(1/3) → 𝑦 = 𝑧 × 1/𝑥²
o constraint: 𝑦 = ±√(1 − 𝑥²)
o solutions: (√(2/3), 1/√3) and (−√(2/3), 1/√3)
[figure: the level curve 𝑦 = 𝑧/𝑥² tangent to the unit circle at the two solution points]

