AMA 615 Exercise 1 2nd Sem, 2021 – 2022

Exercises.
1. Let $\varphi \in C^2(\mathbb{R}^n)$ and suppose that there exists $L > 0$ such that $\|\nabla^2 \varphi(y)\|_2 \le L$ for all $y \in \mathbb{R}^n$. Show that
   \[
   \|\nabla \varphi(u) - \nabla \varphi(v)\|_2 \le L\,\|u - v\|_2 \qquad \forall\, u, v \in \mathbb{R}^n.
   \]
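   A natural starting point (one possible route, not the only one) is the fundamental theorem of calculus applied to the map $t \mapsto \nabla\varphi(v + t(u - v))$:
   \[
   \nabla\varphi(u) - \nabla\varphi(v) = \int_0^1 \nabla^2\varphi\bigl(v + t(u - v)\bigr)(u - v)\, dt,
   \]
   after which the bound on $\|\nabla^2\varphi\|_2$ can be applied under the integral sign.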
2. Consider the function $f(x) = (x_1 + x_2^2)^2$, the point $x^* = (1, 0)^T$ and the direction $d^* = (-1, 1)^T$.
   Show that $d^*$ is a descent direction of $f$ at $x^*$, and find all stepsizes that satisfy the exact line search criterion at $x^*$ along $d^*$.
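   A minimal numerical check for Question 2 (a sketch, assuming NumPy): it evaluates the directional derivative $\nabla f(x^*)^T d^*$, which should be negative for a descent direction, and samples $\phi(\alpha) = f(x^* + \alpha d^*)$ on a grid to locate its minimizer approximately.

   ```python
   import numpy as np

   def f(x):
       # f(x) = (x1 + x2^2)^2
       return (x[0] + x[1] ** 2) ** 2

   def grad_f(x):
       # gradient of f by the chain rule
       g = 2.0 * (x[0] + x[1] ** 2)
       return np.array([g, 2.0 * x[1] * g])

   x_star = np.array([1.0, 0.0])
   d_star = np.array([-1.0, 1.0])

   # descent direction test: the directional derivative should be negative
   print("grad f(x*)^T d* =", grad_f(x_star) @ d_star)

   # sample phi(alpha) = f(x* + alpha d*) on a grid to see where it is minimized
   alphas = np.linspace(0.0, 2.0, 2001)
   phi = np.array([f(x_star + a * d_star) for a in alphas])
   print("approximate exact-line-search stepsize:", alphas[np.argmin(phi)])
   ```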
3. Consider the function $f(x) = e^{-x}$. Consider an iterate of the following form
   \[
   x_{k+1} = x_k + \alpha_k d_k,
   \]
   where $d_k = -f'(x_k)$ and $\alpha_k$ is obtained via Armijo line search by backtracking with $\bar{\alpha}_k \equiv 1$ and $\sigma = 0.1$. Start with $x_0 = 0$.
   (a) Show that $e^{-y} \le 1 - 0.1y$ whenever $y \in [0, 1]$.
   (b) Show that $\alpha_0 = 1$ and $x_1 = 1$.
   (c) Show that, for all $k \ge 0$, it holds that $x_{k+1} > 0$ and $\alpha_k = 1$.
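   For Question 3, a short plain-Python sketch of the backtracking Armijo rule described above; the halving factor $\beta = 0.5$ is an assumption, since the exercise only fixes $\bar{\alpha}_k \equiv 1$ and $\sigma = 0.1$.

   ```python
   import math

   def f(x):
       return math.exp(-x)

   def fprime(x):
       return -math.exp(-x)

   def armijo_backtracking(x, d, sigma=0.1, alpha_bar=1.0, beta=0.5):
       # Backtracking: start from alpha_bar and shrink by beta until the Armijo
       # condition f(x + alpha*d) <= f(x) + sigma*alpha*f'(x)*d holds.
       # (beta = 0.5 is an assumed backtracking factor; the exercise does not fix it.)
       alpha = alpha_bar
       while f(x + alpha * d) > f(x) + sigma * alpha * fprime(x) * d:
           alpha *= beta
       return alpha

   x = 0.0
   for k in range(5):
       d = -fprime(x)                     # steepest descent direction in one dimension
       alpha = armijo_backtracking(x, d)  # parts (b)-(c) assert alpha = 1 at every step
       x = x + alpha * d
       print(k, alpha, x)
   ```

   Running the loop from $x_0 = 0$ makes parts (b) and (c) easy to conjecture before proving them.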
4. Let $Q \succ 0$ and $b \in \mathbb{R}^n$. Define
   \[
   f(x) = \frac{1}{2} x^T Q x - b^T x.
   \]
   (a) Suppose that $\bar{x}$ is not a stationary point of $f$ and suppose the steepest descent method with exact line search is applied to minimizing $f$ starting from $\bar{x}$.
   Show that the stepsize that satisfies the exact line search criterion at $\bar{x}$ along $-\nabla f(\bar{x})$ is given by
   \[
   \frac{\|\nabla f(\bar{x})\|_2^2}{[\nabla f(\bar{x})]^T Q \nabla f(\bar{x})}.
   \]
   (b) Suppose that $x^*$ is the unique minimizer of $f$ and let $v$ be any eigenvector of $Q$.
      i. Let $x_0 = x^* + v$. Show that $x_0$ is not a stationary point of $f$.
      ii. Show that the steepest descent method with exact line search initialized at $x_0 = x^* + v$ gives $x_1 = x^*$.
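   For Question 4, a sketch (assuming NumPy) of one steepest descent step with the exact stepsize from part (a), started from $x_0 = x^* + v$ with $v$ an eigenvector of $Q$; the matrix $Q$ and vector $b$ below are arbitrary test data.

   ```python
   import numpy as np

   rng = np.random.default_rng(0)

   # build a random positive definite Q and a right-hand side b
   M = rng.standard_normal((4, 4))
   Q = M @ M.T + 4 * np.eye(4)
   b = rng.standard_normal(4)

   x_star = np.linalg.solve(Q, b)      # unique minimizer of f(x) = 0.5 x^T Q x - b^T x
   eigvals, eigvecs = np.linalg.eigh(Q)
   v = eigvecs[:, 0]                    # an eigenvector of Q

   x = x_star + v                       # x0 = x* + v
   g = Q @ x - b                        # gradient of f at x0
   alpha = (g @ g) / (g @ Q @ g)        # exact line search stepsize along -g (part (a))
   x1 = x - alpha * g

   print("||x1 - x*|| =", np.linalg.norm(x1 - x_star))  # expected to be ~0 (part (b) ii)
   ```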
5. Consider the function $f : \mathbb{R}^2 \to \mathbb{R}$ defined by
   \[
   f(x) = (x_1 + x_2 - 1)_+^2 + \frac{1}{2}\|x\|_2^2,
   \]
   where
   \[
   t_+ :=
   \begin{cases}
   t & \text{if } t \ge 0, \\
   0 & \text{otherwise.}
   \end{cases}
   \]
   (a) Compute $\nabla f(x)$.
   (b) Consider an iterate of the following form:
   \[
   x_{k+1} = x_k + \alpha_k d_k.
   \]
   Let $d_k$ be the steepest descent direction and $\alpha_k$ be chosen to satisfy the Wolfe conditions.
   Suppose that $x_k$ is nonstationary for all $k$.
      i. Show that the sequence $\{x_k\}$ is bounded.
      ii. Show that any accumulation point of $\{x_k\}$ is stationary.
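   For Question 5, a sketch (assuming NumPy, and reading the objective as $f(x) = (x_1 + x_2 - 1)_+^2 + \frac{1}{2}\|x\|_2^2$ as written above) that compares a candidate gradient formula against central finite differences.

   ```python
   import numpy as np

   def plus(t):
       # t_+ = max(t, 0)
       return max(t, 0.0)

   def f(x):
       return plus(x[0] + x[1] - 1.0) ** 2 + 0.5 * (x @ x)

   def grad_f(x):
       # candidate gradient: (t_+)^2 is differentiable with derivative 2 t_+
       return 2.0 * plus(x[0] + x[1] - 1.0) * np.ones(2) + x

   def fd_grad(func, x, h=1e-6):
       # central finite-difference approximation of the gradient
       g = np.zeros_like(x)
       for i in range(x.size):
           e = np.zeros_like(x)
           e[i] = h
           g[i] = (func(x + e) - func(x - e)) / (2.0 * h)
       return g

   for x in [np.array([2.0, 1.0]), np.array([-1.0, 0.5])]:
       print(x, np.linalg.norm(grad_f(x) - fd_grad(f, x)))
   ```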
6. Let $f(x) = h(Ax) + \mu\|x\|^2$, where $h(y) = \sum_{i=1}^m \ln(1 + e^{-y_i})$, $A \in \mathbb{R}^{m \times n}$, and $\mu > 0$.
   (a) By considering $\|\nabla^2 h(y)\|_2$ and using Question 1, show that for any $u$ and $v \in \mathbb{R}^m$, it holds that
   \[
   \|\nabla h(u) - \nabla h(v)\|_2 \le \frac{1}{4}\|u - v\|_2.
   \]

   (b) Show that at any nonstationary point, the Newton direction $-[\nabla^2 f(x)]^{-1}\nabla f(x)$ is a descent direction.
   (c) Consider an iterate of the following form:
   \[
   x_{k+1} = x_k + \alpha_k d_k.
   \]
   Let $d_k$ be the Newton direction and $\alpha_k$ be chosen to satisfy the Wolfe conditions. Suppose that $x_k$ is nonstationary for all $k$.
      i. Show that the sequence $\{x_k\}$ is bounded.
      ii. Show that any accumulation point of $\{x_k\}$ is stationary.
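   For Question 6, a sketch (assuming NumPy, and reading the regularizer as $\mu\|x\|^2$ as written above) that assembles $\nabla f$ and $\nabla^2 f$ for the logistic-type $h$ and checks numerically that the Newton direction is a descent direction; the test data $A$, $\mu$ and $x$ are arbitrary.

   ```python
   import numpy as np

   def sigmoid(t):
       return 1.0 / (1.0 + np.exp(-t))

   def grad_f(x, A, mu):
       # gradient of h(Ax) is A^T grad_h(Ax), with grad_h(y)_i = sigmoid(y_i) - 1
       y = A @ x
       return A.T @ (sigmoid(y) - 1.0) + 2.0 * mu * x

   def hess_f(x, A, mu):
       # Hessian of h(Ax) is A^T diag(sigmoid(y)(1 - sigmoid(y))) A, plus 2*mu*I
       y = A @ x
       w = sigmoid(y) * (1.0 - sigmoid(y))  # diagonal entries, each in (0, 1/4]
       return A.T @ (w[:, None] * A) + 2.0 * mu * np.eye(x.size)

   rng = np.random.default_rng(1)
   A = rng.standard_normal((5, 3))
   mu = 0.1
   x = rng.standard_normal(3)

   g = grad_f(x, A, mu)
   d = -np.linalg.solve(hess_f(x, A, mu), g)  # Newton direction
   print("g^T d =", g @ d)                    # expected to be negative (part (b))
   ```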

7. Consider
   \[
   \operatorname*{Minimize}_{x \in \mathbb{R}^n} \; f(x) \qquad \text{with} \qquad f(x) = \frac{1}{2} x^T A x + b^T x,
   \]
   where $A = uu^T + I$ for some $u \in \mathbb{R}^n \setminus \{0\}$, and $b \in \mathbb{R}^n$.
   (a) Show that $A \succ 0$.
   (b) Argue that the eigenvalues of $A$ are $1$ (with multiplicity $n - 1$) and $1 + \|u\|_2^2$.
   (c) Suppose that the conjugate gradient method is applied to the above function, with $x_0 = 0$. Argue that $x_2 = -A^{-1} b$.
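   For Question 7(c), a sketch (assuming NumPy) of two iterations of the linear conjugate gradient method applied to minimizing $f$, equivalently to solving $Ax = -b$, with $A = uu^T + I$ and random test data.

   ```python
   import numpy as np

   rng = np.random.default_rng(2)
   n = 6
   u = rng.standard_normal(n)
   b = rng.standard_normal(n)
   A = np.outer(u, u) + np.eye(n)   # A = u u^T + I, two distinct eigenvalues

   # minimizing f(x) = 0.5 x^T A x + b^T x amounts to solving A x = -b
   x = np.zeros(n)
   r = -b - A @ x                    # residual of A x = -b at x0 = 0
   p = r.copy()
   for k in range(2):                # two conjugate gradient iterations
       Ap = A @ p
       alpha = (r @ r) / (p @ Ap)
       x = x + alpha * p
       r_new = r - alpha * Ap
       beta = (r_new @ r_new) / (r @ r)
       p = r_new + beta * p
       r = r_new

   print("||x2 + A^{-1} b|| =", np.linalg.norm(x - np.linalg.solve(A, -b)))
   ```

   Since $A$ has only two distinct eigenvalues (part (b)), exact arithmetic would give $x_2 = -A^{-1}b$; the printed norm reflects rounding error only.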
