
Topic and contents

School of Mathematics and Statistics


MATH3161/MATH5165 Optimization
Prof Jeya Jeyakumar

Topic 05 Inequality Constraints

Inequality constraints
  Active constraints
  Regularity
First order conditions
  KKT conditions
  Complementarity
  Constrained stationary point
Second order conditions
  Sufficient conditions
Sensitivity analysis
  Constraint perturbations
Duality
  Strong and weak duality
  Wolfe Duality
  Linear programming duality


Inequality constraints

Problem considered
Basic problem

$$\underset{x \in \Omega}{\text{Minimize}}\ f(x)$$
Unconstrained problem: $\Omega = \mathbb{R}^n$
Equality constrained problem: $\Omega = \{\, x \in \mathbb{R}^n : c_i(x) = 0,\ i = 1, \dots, m \,\}$
Inequality constrained problem:
$$\Omega = \{\, x \in \mathbb{R}^n : c_i(x) = 0,\ i = 1, \dots, m_E, \ \ c_i(x) \le 0,\ i = m_E + 1, \dots, m \,\}$$

Notes
Set of equality constraints: $E = \{1, \dots, m_E\}$
Set of inequality constraints: $I = \{m_E + 1, \dots, m\}$
A feasible point must satisfy all equality and all inequality constraints
Which inequality constraints are active at the solution?
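As a computational aside (not part of the original slides), a problem in this form can be handed to a general-purpose solver. The sketch below uses the worked example introduced later in this topic and scipy.optimize.minimize; note that SciPy's 'ineq' convention is fun(x) >= 0, the opposite sign to the c_i(x) <= 0 convention used here, so -c_i is passed.

```python
# Minimal sketch, assuming the example objective and constraints used later in
# this topic; SciPy expects 'ineq' constraints as fun(x) >= 0, so we pass -c_i.
import numpy as np
from scipy.optimize import minimize

def f(x):
    return x[0]**2 + (x[1] - 1)**2 * (x[1] - 3)**2 + x[1] / 2

def c(x):                                   # course convention: c_i(x) <= 0
    return np.array([x[0]**2 - x[1],        # c1
                     x[0] - 1,              # c2
                     -x[0] + x[1] - 2,      # c3
                     x[0] + x[1] - 4])      # c4

cons = [{'type': 'ineq', 'fun': lambda x, i=i: -c(x)[i]} for i in range(4)]
res = minimize(f, x0=np.array([0.0, 0.5]), method='SLSQP', constraints=cons)
print(res.x, res.fun)
```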

Inequality constraints

Active constraints

Active constraints
Feasible region
$$\Omega = \{\, x \in \mathbb{R}^n : c_i(x) = 0,\ i \in E, \ \ c_i(x) \le 0,\ i \in I \,\}$$

Definition (Active constraints)
The set of active constraints at a feasible point $x$ is
$$A(x) = \{\, i \in \{1, \dots, m\} : c_i(x) = 0 \,\}$$
Constraints satisfied as equalities at a point $x$ are active at $x$
At any feasible point $x$ the equality constraints must be satisfied, so
$$A(x) = \{1, \dots, m_E\} \cup \{\, i \in \{m_E + 1, \dots, m\} : c_i(x) = 0 \,\} = E \cup \{\, i \in I : c_i(x) = 0 \,\}$$
Point $x$ feasible and $i \notin A(x) \Rightarrow c_i(x) < 0$
Point $x$ in the interior of $\Omega \Rightarrow A(x) = \emptyset$ (which requires $m_E = 0$)

Inequality constraints

Active constraints

Active constraints Example


Example (Active constraints)
$$\Omega = \{\, x \in \mathbb{R}^2 : c_1(x) := x_1^2 - x_2 \le 0,\ \ c_2(x) := x_1 - 1 \le 0,\ \ c_3(x) := -x_1 + x_2 - 2 \le 0,\ \ c_4(x) := x_1 + x_2 - 4 \le 0 \,\}$$
Which points are feasible? If feasible, find the active constraints.
$$x^{(1)} = \begin{bmatrix} 0 \\ 3 \end{bmatrix}, \quad x^{(2)} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \quad x^{(3)} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \quad x^{(4)} = \begin{bmatrix} -1 \\ 1 \end{bmatrix}, \quad x^{(5)} = \begin{bmatrix} 1 \\ 3 \end{bmatrix}$$
Repeat when the first constraint becomes an equality $c_1(x) = 0$.

[Figure: the feasible region $\Omega$ with the example points marked; left panel with $c_1$ as an inequality, right panel with $c_1$ as an equality.]
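A quick numerical check of feasibility and active sets for this example (a sketch only; the point coordinates and constraint signs are as reconstructed above):

```python
import numpy as np

def c(x):
    return np.array([x[0]**2 - x[1], x[0] - 1, -x[0] + x[1] - 2, x[0] + x[1] - 4])

points = {'x(1)': [0, 3], 'x(2)': [0, 1], 'x(3)': [0, 0], 'x(4)': [-1, 1], 'x(5)': [1, 3]}
tol = 1e-10
for name, x in points.items():
    vals = c(np.array(x, dtype=float))
    feasible = np.all(vals <= tol)
    active = [i + 1 for i, v in enumerate(vals) if abs(v) <= tol]   # 1-based constraint indices
    print(name, 'feasible' if feasible else 'infeasible', 'A =', active if feasible else '-')
```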

Inequality constraints

Regularity

Regularity
Definition (Regular point)
A feasible point $x$ is a regular point $\iff$ the gradients of the active constraints $\nabla c_i(x),\ i \in A(x)$, are linearly independent
Example (Regular points)
$$\Omega = \{\, x \in \mathbb{R}^2 : c_1(x) := x_1^2 - x_2 \le 0,\ \ c_2(x) := x_1 - 1 \le 0,\ \ c_3(x) := -x_1 + x_2 - 2 \le 0,\ \ c_4(x) := x_1 + x_2 - 4 \le 0 \,\}$$
Which of these points are regular points?
$$x^{(2)} = \begin{bmatrix} 0 \\ 1 \end{bmatrix}, \quad x^{(3)} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}, \quad x^{(4)} = \begin{bmatrix} -1 \\ 1 \end{bmatrix}, \quad x^{(5)} = \begin{bmatrix} 1 \\ 3 \end{bmatrix}$$

[Figure: the feasible region $\Omega$ with the example points marked.]
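Regularity can be checked mechanically: stack the gradients of the active constraints and test linear independence via the matrix rank. A sketch, again assuming the constraints as reconstructed above:

```python
import numpy as np

def c(x):
    return np.array([x[0]**2 - x[1], x[0] - 1, -x[0] + x[1] - 2, x[0] + x[1] - 4])

def grad_c(x):                              # rows are the gradients of c1, ..., c4
    return np.array([[2 * x[0], -1.0],
                     [1.0, 0.0],
                     [-1.0, 1.0],
                     [1.0, 1.0]])

def is_regular(x, tol=1e-10):
    active = np.abs(c(x)) <= tol
    G = grad_c(x)[active]                   # gradients of the active constraints
    return G.shape[0] == 0 or np.linalg.matrix_rank(G) == G.shape[0]

for x in ([0, 1], [0, 0], [-1, 1], [1, 3]):
    print(x, is_regular(np.array(x, dtype=float)))
# [1, 3] has three active constraints in R^2, so their gradients cannot be independent
```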

First order conditions

KKT conditions

KKT conditions
Proposition (Karush-Kuhn-Tucker (KKT) conditions)
$f, c_i,\ i = 1, \dots, m$, continuously differentiable on $\Omega$; $x^*$ a local minimizer and a regular point $\Longrightarrow$ there exist multipliers $\lambda_i^*,\ i = 1, \dots, m$, such that
$$\nabla_x L(x^*, \lambda^*) = \nabla f(x^*) + \sum_{i=1}^{m} \lambda_i^* \nabla c_i(x^*) = 0 \tag{1a}$$
$$c_i(x^*) = 0, \qquad i = 1, \dots, m_E \tag{1b}$$
$$c_i(x^*) \le 0, \qquad i = m_E + 1, \dots, m \tag{1c}$$
$$\lambda_i^* \ge 0, \qquad i = m_E + 1, \dots, m \tag{1d}$$
$$\lambda_i^*\, c_i(x^*) = 0, \qquad i = 1, \dots, m \tag{1e}$$

KKT conditions hold at any local minimizer which is a regular point
Equations (1b) and (1c) represent the feasibility of $x^*$
Non-negativity (1d) applies only to the inequality constraint multipliers
There are no sign restrictions on the equality constraint multipliers
Equations (1e) are the complementarity conditions
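The KKT conditions are easy to verify numerically at a candidate pair $(x, \lambda)$. The sketch below (illustrative only, for the case $m_E = 0$) checks (1a), (1c), (1d) and (1e) for the example problem used later in this topic at $x^{(3)} = [\,0\ \ 0\,]^T$ with $\lambda = [-47/2, 0, 0, 0]$: stationarity and complementarity hold, but (1d) fails.

```python
import numpy as np

def kkt_check(grad_f, grad_c, c, x, lam, tol=1e-8):
    stationarity = np.linalg.norm(grad_f(x) + grad_c(x).T @ lam)   # residual of (1a)
    feasible = bool(np.all(c(x) <= tol))                            # (1c), with m_E = 0
    nonnegative = bool(np.all(lam >= -tol))                         # (1d)
    complementarity = float(np.max(np.abs(lam * c(x))))             # (1e)
    return stationarity, feasible, nonnegative, complementarity

grad_f = lambda x: np.array([2 * x[0], 4 * (x[1] - 1) * (x[1] - 2) * (x[1] - 3) + 0.5])
grad_c = lambda x: np.array([[2 * x[0], -1.0], [1.0, 0.0], [-1.0, 1.0], [1.0, 1.0]])
c = lambda x: np.array([x[0]**2 - x[1], x[0] - 1, -x[0] + x[1] - 2, x[0] + x[1] - 4])

print(kkt_check(grad_f, grad_c, c, np.array([0.0, 0.0]), np.array([-23.5, 0.0, 0.0, 0.0])))
# (0.0, True, False, 0.0): a constrained stationary point, but not a KKT point
```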

First order conditions

Complementarity

Complementarity
Definition (Complementarity conditions)
$$\lambda_i^*\, c_i(x^*) = 0, \qquad i = 1, \dots, m$$
Inequality constraint $i$: $x^*$ feasible but constraint $i$ not active $\Rightarrow c_i(x^*) < 0$
$c_i(x^*) < 0$ together with complementarity $\lambda_i^* c_i(x^*) = 0 \Rightarrow$ multiplier $\lambda_i^* = 0$
Inactive constraints have zero multipliers $\Rightarrow$ the KKT conditions become
$$\nabla f(x^*) + \sum_{i \in A^*} \lambda_i^* \nabla c_i(x^*) = 0 \tag{2a}$$
$$c_i(x^*) = 0, \qquad i = 1, \dots, m_E \tag{2b}$$
$$c_i(x^*) \le 0, \qquad i = m_E + 1, \dots, m \tag{2c}$$
$$\lambda_i^* \ge 0, \qquad i = m_E + 1, \dots, m \tag{2d}$$
$$\lambda_i^* = 0, \qquad i \notin A^* \tag{2e}$$
where $A^* = A(x^*)$ is the set of active constraints at $x^*$.

First order conditions

Constrained stationary point

Constrained stationary point


Definition (Constrained stationary point)
A constrained stationary point $x^*$ is a feasible point at which there exist multipliers $\lambda_i^*,\ i \in A^*$, satisfying
$$\nabla f(x^*) + \sum_{i \in A^*} \lambda_i^* \nabla c_i(x^*) = 0$$
First order necessary conditions: $x^*$ a local minimizer $\Rightarrow \lambda_i^* \ge 0$ for all $i \in A^* \cap I$
If $\lambda_i^* < 0$ for an active inequality constraint $i \in A^* \cap I \Rightarrow x^*$ is not a local minimizer (contrapositive)
First order necessary conditions: $x^*$ a local maximizer $\Rightarrow \lambda_i^* \le 0$ for all $i \in A^* \cap I$
If $\lambda_i^* > 0$ for an active inequality constraint $i \in A^* \cap I \Rightarrow x^*$ is not a local maximizer (contrapositive)

First order conditions

Constrained stationary point

Constrained stationary points Example


Example (Constrained stationary points)
$$f(x) = x_1^2 + (x_2 - 1)^2 (x_2 - 3)^2 + \frac{x_2}{2}$$
$$\Omega = \{\, x \in \mathbb{R}^2 : c_1(x) := x_1^2 - x_2 \le 0,\ \ c_2(x) := x_1 - 1 \le 0,\ \ c_3(x) := -x_1 + x_2 - 2 \le 0,\ \ c_4(x) := x_1 + x_2 - 4 \le 0 \,\}$$
so $m_E = 0$, $E = \emptyset$ (no equality constraints), $m = 4$, $I = \{1, 2, 3, 4\}$
1. Show that $x = [\,0\ \ 2\,]^T$ is not a constrained stationary point
2. Show that $x^{(3)} = [\,0\ \ 0\,]^T$ is a constrained stationary point, but not a local minimizer
3. Find constrained stationary points, if they exist, with
   a) $A = \emptyset$
   b) $A = \{1, 3\}$
4. How would you find all constrained stationary points?


First order conditions

Constrained stationary point

Constrained stationary points Gradients


Solution (Constrained stationary points: Gradients)
$$\nabla f(x) = \begin{bmatrix} 2x_1 \\ 4(x_2 - 1)(x_2 - 2)(x_2 - 3) + \tfrac{1}{2} \end{bmatrix}$$
$$\nabla c_1(x) = \begin{bmatrix} 2x_1 \\ -1 \end{bmatrix}, \quad \nabla c_2(x) = \begin{bmatrix} 1 \\ 0 \end{bmatrix}, \quad \nabla c_3(x) = \begin{bmatrix} -1 \\ 1 \end{bmatrix}, \quad \nabla c_4(x) = \begin{bmatrix} 1 \\ 1 \end{bmatrix}$$
[Figure: contour plot of $f(x) = x_1^2 + (x_2 - 1)^2 (x_2 - 3)^2 + x_2/2$ over the feasible region, with the points $x^{(3)}$, $x^{(4)}$, $x^{(5)}$, $x^{(6)}$ marked.]

First order conditions

Constrained stationary point

Constrained stationary points Solution Part 1


Solution (Constrained stationary points: Part 1)
1. $x = [\,0\ \ 2\,]^T \Rightarrow c(x) = [\,{-2}\ \ {-1}\ \ 0\ \ {-2}\,]^T \le 0 \Rightarrow x$ feasible
Active constraints: $A(x) = \{3\}$
Inactive constraints: $c_i(x) < 0,\ i = 1, 2, 4 \Rightarrow \lambda_i = 0,\ i = 1, 2, 4$
Gradient of the Lagrangian:
$$\nabla f(x) + \sum_{i \in A(x)} \lambda_i \nabla c_i(x) = \nabla f(x) + \lambda_3 \nabla c_3(x) = \begin{bmatrix} 0 \\ \tfrac{1}{2} \end{bmatrix} + \lambda_3 \begin{bmatrix} -1 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$
No solution for $\lambda_3 \Rightarrow x$ is not a constrained stationary point

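The non-existence of $\lambda_3$ can also be seen numerically: the least-squares problem $\nabla f(x) + \lambda_3 \nabla c_3(x) \approx 0$ has a strictly positive residual. A sketch, using the gradients reconstructed above:

```python
import numpy as np

grad_f = np.array([0.0, 0.5])               # grad f at x = [0, 2]
grad_c3 = np.array([-1.0, 1.0])             # grad c3 at x = [0, 2]
lam3, residual, *_ = np.linalg.lstsq(grad_c3.reshape(2, 1), -grad_f, rcond=None)
print(lam3, residual)                        # nonzero residual: no multiplier satisfies (1a)
```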

First order conditions

Constrained stationary point

Constrained stationary points Solution Part 2


Solution (Constrained stationary points: Part 2)
2. $x^{(3)} = [\,0\ \ 0\,]^T \Rightarrow c(x^{(3)}) = [\,0\ \ {-1}\ \ {-2}\ \ {-4}\,]^T \le 0 \Rightarrow x^{(3)}$ feasible
Active constraints: $A(x^{(3)}) = \{1\}$
Inactive constraints: $c_i(x^{(3)}) < 0,\ i = 2, 3, 4 \Rightarrow \lambda_i = 0,\ i = 2, 3, 4$
Gradient of the Lagrangian:
$$\nabla f(x^{(3)}) + \sum_{i \in A(x^{(3)})} \lambda_i \nabla c_i(x^{(3)}) = \nabla f(x^{(3)}) + \lambda_1 \nabla c_1(x^{(3)}) = \begin{bmatrix} 0 \\ -\tfrac{47}{2} \end{bmatrix} + \lambda_1 \begin{bmatrix} 0 \\ -1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$
A solution exists with $\lambda_1 = -\tfrac{47}{2} \Rightarrow x^{(3)}$ is a constrained stationary point
Constraint $c_1$ is an inequality constraint and $\lambda_1 < 0 \Rightarrow x^{(3)}$ is not a local minimizer


First order conditions

Constrained stationary point

Constrained stationary points Solution Part 3 a)


Solution (Constrained stationary points: Part 3 a))
3. a) $A = \emptyset \Rightarrow |A| = 0$: no active constraints
Gradient of the Lagrangian reduces to
$$\nabla f(x) = \begin{bmatrix} 2x_1 \\ 4(x_2 - 1)(x_2 - 2)(x_2 - 3) + \tfrac{1}{2} \end{bmatrix} = 0$$
$\nabla f(x) = 0 \Rightarrow x_1 = 0$ and $8(x_2 - 1)(x_2 - 2)(x_2 - 3) + 1 = 0$
Cubic $\Rightarrow$ three solutions $x_2 = 0.94\ldots,\ 2.12\ldots,\ 2.93\ldots$ (iqex1.m)
$c_3(x) = -x_1 + x_2 - 2 \le 0 \Rightarrow$ the second and third solutions are not feasible
Check: $x = [\,0\ \ 0.94\ldots\,]^T$ is strictly feasible $\Rightarrow \lambda_i = 0,\ i = 1, 2, 3, 4$, so it is a constrained stationary point with $A = \emptyset$
[Figure: left, contour plot of $f(x) = x_1^2 + (x_2 - 1)^2 (x_2 - 3)^2 + x_2/2$ over the feasible region; right, the cubic $8(x_2 - 1)(x_2 - 2)(x_2 - 3) + 1$ whose roots give the candidate values of $x_2$.]
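The cubic can be solved with any root finder; the slides point to the MATLAB script iqex1.m, and the NumPy lines below are a sketch of the same computation.

```python
import numpy as np

# 8(x2 - 1)(x2 - 2)(x2 - 3) + 1 = 8*x2^3 - 48*x2^2 + 88*x2 - 47
roots = np.sort(np.roots([8.0, -48.0, 88.0, -47.0]).real)
print(roots)                                 # approx [0.94, 2.12, 2.93]
print([r for r in roots if r - 2 <= 0])      # c3 at x1 = 0 keeps only the first root
```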

First order conditions

Constrained stationary point

Constrained stationary points Solution Part 3 b)


Solution (Constrained stationary points: Part 3 b))
3. b) $A = \{1, 3\}$
Active constraints $c_1(x) = 0$, $c_3(x) = 0$:
$$\left.\begin{array}{r} x_1^2 - x_2 = 0 \\ -x_1 + x_2 - 2 = 0 \end{array}\right\} \ \Rightarrow\ x_2 = x_1^2, \ \ x_1^2 - x_1 - 2 = 0 \ \Rightarrow\ \hat{x} = \begin{bmatrix} 2 \\ 4 \end{bmatrix} \ \text{or}\ \ x^{(4)} = \begin{bmatrix} -1 \\ 1 \end{bmatrix}$$
At $\hat{x} = [\,2\ \ 4\,]^T$: $c(\hat{x}) = [\,0\ \ 1\ \ 0\ \ 2\,]^T \not\le 0 \Rightarrow \hat{x}$ is not feasible
At $x^{(4)} = [\,{-1}\ \ 1\,]^T$: $c(x^{(4)}) = [\,0\ \ {-2}\ \ 0\ \ {-4}\,]^T \le 0 \Rightarrow x^{(4)}$ is feasible
Gradient of the Lagrangian at $x^{(4)}$:
$$\nabla f(x^{(4)}) + \lambda_1 \nabla c_1(x^{(4)}) + \lambda_3 \nabla c_3(x^{(4)}) = \begin{bmatrix} -2 \\ \tfrac{1}{2} \end{bmatrix} + \lambda_1 \begin{bmatrix} -2 \\ -1 \end{bmatrix} + \lambda_3 \begin{bmatrix} -1 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \end{bmatrix}$$
Solve the linear system $\Rightarrow \lambda_1^{(4)} = -\tfrac{1}{2},\ \lambda_3^{(4)} = -1$
Inactive constraints: $c_2(x^{(4)}) < 0,\ c_4(x^{(4)}) < 0 \Rightarrow \lambda_2^{(4)} = 0,\ \lambda_4^{(4)} = 0$
$x^{(4)}$ is a constrained stationary point
Negative multiplier for an active inequality constraint $\Rightarrow x^{(4)}$ is not a local minimizer
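The 2x2 linear system for the multipliers at $x^{(4)}$ can be solved directly. A sketch using the reconstructed gradients:

```python
import numpy as np

x = np.array([-1.0, 1.0])
grad_f = np.array([2 * x[0], 4 * (x[1] - 1) * (x[1] - 2) * (x[1] - 3) + 0.5])   # [-2, 0.5]
G = np.column_stack(([2 * x[0], -1.0],      # grad c1 at x(4)
                     [-1.0, 1.0]))          # grad c3 at x(4)
lam = np.linalg.solve(G, -grad_f)
print(lam)                                   # approx [-0.5, -1.0]: lambda_1, lambda_3
```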

First order conditions

Constrained stationary point

Constrained stationary points Solution Part 4


Solution (Constrained stationary points: Part 4)
4. Find all constrained stationary points
Consider all possible active sets $A \subseteq \{1, 2, 3, 4\}$:
$|A| = 0 \Rightarrow A = \emptyset$
$|A| = 1 \Rightarrow A = \{1\},\ \{2\},\ \{3\},\ \{4\}$
$|A| = 2 \Rightarrow A = \{1,2\},\ \{1,3\},\ \{1,4\},\ \{2,3\},\ \{2,4\},\ \{3,4\}$
$|A| = 3 \Rightarrow A = \{1,2,3\},\ \{1,2,4\},\ \{1,3,4\},\ \{2,3,4\}$
$|A| = 4 \Rightarrow A = \{1,2,3,4\}$
[Figure: contour plot of $f(x) = x_1^2 + (x_2 - 1)^2 (x_2 - 3)^2 + x_2/2$ over the feasible region with the candidate points marked.]
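One systematic, if brute-force, approach is sketched below (illustrative only): enumerate the $2^4$ candidate active sets; for each set one solves the stationarity equation together with $c_i(x) = 0$ for $i \in A$, and then checks feasibility and the multiplier signs.

```python
from itertools import combinations

constraints = [1, 2, 3, 4]
candidate_sets = [set(A) for k in range(len(constraints) + 1)
                  for A in combinations(constraints, k)]
print(len(candidate_sets))        # 16 candidate active sets, from {} to {1, 2, 3, 4}
for A in candidate_sets:
    # for each A: solve grad f(x) + sum_{i in A} lambda_i grad c_i(x) = 0 together
    # with c_i(x) = 0, i in A (n + |A| equations), then check feasibility of x
    print(A)
```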

Second order conditions

Sufficient conditions

Second order sufficient conditions


Proposition (Second order sufficient conditions)
Let $f, c_i \in C^2(\Omega)$ and let the active constraints be $A^* = \{\, i \in E \cup I : c_i(x^*) = 0 \,\}$.
If there exist multipliers $\lambda^* \in \mathbb{R}^m$ such that
$$\nabla f(x^*) + \sum_{i \in A^*} \lambda_i^* \nabla c_i(x^*) = 0 \tag{3a}$$
$$c_i(x^*) = 0, \qquad i \in E \tag{3b}$$
$$c_i(x^*) \le 0, \qquad i \in I \tag{3c}$$
$$\lambda_i^* = 0, \qquad i \notin A^* \tag{3d}$$
$$\lambda_i^* \ge 0, \qquad i \in A^* \cap I \tag{3e}$$
and
$$d^T \nabla_x^2 L(x^*, \lambda^*)\, d > 0 \quad \text{for all } d \in F^*,\ d \neq 0,$$
where
$$F^* = \left\{\, d \in \mathbb{R}^n : \begin{array}{ll} d^T \nabla c_i(x^*) = 0, & i \in E \cup \{\, i \in I : \lambda_i^* > 0 \,\} \\ d^T \nabla c_i(x^*) \le 0, & i \in \{\, i \in A^* \cap I : \lambda_i^* = 0 \,\} \end{array} \right\},$$
then $x^*$ is a strict local minimizer of $f$ on $\Omega$.



Second order conditions

Sufficient conditions

Strict complementarity
Definition (Strict complementarity)
A constrained stationary point $x^*, \lambda^*$ satisfies strict complementarity if
$$\lambda_i^* > 0 \quad \text{for all } i \in A^* \cap I$$
Strict complementarity implies
$$F^* = \{\, d \in \mathbb{R}^n : d^T \nabla c_i(x^*) = 0,\ i \in A^* \,\}$$
Let $x^*$ be a regular point and let $t = |A^*|$. Let $Z^* \in \mathbb{R}^{n \times (n-t)}$ be a full rank matrix satisfying $(Z^*)^T \nabla c_i(x^*) = 0,\ i \in A^*$. Then
$$F^* = \{\, d \in \mathbb{R}^n : d = Z^* v,\ v \in \mathbb{R}^{n-t} \,\}$$
The columns of $Z^*$ form a basis for the tangent space to the active constraints at $x^*$
$t = n \Rightarrow F^* = \{0\} \Rightarrow Z^*$ does not exist
Reduced Hessian of the Lagrangian:
$$W_Z^* = (Z^*)^T \nabla_x^2 L(x^*, \lambda^*)\, Z^* \in \mathbb{R}^{(n-t) \times (n-t)}$$


Second order conditions

Sufficient conditions

Second order sufficiency


Proposition (Second order sufficiency)
Let $x^*, \lambda^*$ be a regular constrained stationary point.
1. If $\lambda_i^* > 0$ for all $i \in I \cap A^*$ and $W_Z^*$ is positive definite, then $x^*$ is a strict local minimizer
2. If $\lambda_i^* < 0$ for all $i \in I \cap A^*$ and $W_Z^*$ is negative definite, then $x^*$ is a strict local maximizer
3. If any of the following conditions is satisfied:
   - there exist $i, j \in I \cap A^*$ such that $\lambda_i^* > 0$ and $\lambda_j^* < 0$, or
   - there exists $j \in I \cap A^*$ such that $\lambda_j^* > 0$ and $W_Z^*$ has a negative eigenvalue, or
   - there exists $j \in I \cap A^*$ such that $\lambda_j^* < 0$ and $W_Z^*$ has a positive eigenvalue, or
   - $W_Z^*$ is indefinite,
   then $x^*$ is neither a local minimizer nor a local maximizer


Second order conditions

Sufficient conditions

Second order sufficiency Example


Example (Second order sufficiency)
$$f(x) = x_1^2 + (x_2 - 1)^2 (x_2 - 3)^2 + \frac{x_2}{2}$$
$$\Omega = \{\, x \in \mathbb{R}^2 : c_1(x) := x_1^2 - x_2 \le 0,\ \ c_2(x) := x_1 - 1 \le 0,\ \ c_3(x) := -x_1 + x_2 - 2 \le 0,\ \ c_4(x) := x_1 + x_2 - 4 \le 0 \,\}$$
Show that $x^{(3)} = [\,0\ \ 0\,]^T$ is a strict local maximizer using the second order sufficiency conditions.

Solution
$c(x^{(3)}) = [\,0\ \ {-1}\ \ {-2}\ \ {-4}\,]^T \le 0 \Rightarrow x^{(3)}$ feasible
Active constraints: $A(x^{(3)}) = \{1\}$
Constrained stationary point with multiplier $\lambda_1^{(3)} = -\tfrac{47}{2}$

Second order conditions

Sufficient conditions

Second order sufficiency Example cont.


Gradients and Hessians of the objective and the active constraint:
$$\nabla f(x) = \begin{bmatrix} 2x_1 \\ 4(x_2^3 - 6x_2^2 + 11x_2 - 6) + \tfrac{1}{2} \end{bmatrix}, \qquad \nabla c_1(x) = \begin{bmatrix} 2x_1 \\ -1 \end{bmatrix}$$
$$\nabla^2 f(x) = \begin{bmatrix} 2 & 0 \\ 0 & 4(3x_2^2 - 12x_2 + 11) \end{bmatrix}, \qquad \nabla^2 c_1(x) = \begin{bmatrix} 2 & 0 \\ 0 & 0 \end{bmatrix}$$
Hessian of the Lagrangian at $x^{(3)}$ with $\lambda_1^{(3)} = -\tfrac{47}{2}$:
$$\nabla_x^2 L(x^{(3)}, \lambda^{(3)}) = \begin{bmatrix} 2 & 0 \\ 0 & 44 \end{bmatrix} - \frac{47}{2} \begin{bmatrix} 2 & 0 \\ 0 & 0 \end{bmatrix} = \begin{bmatrix} -45 & 0 \\ 0 & 44 \end{bmatrix}$$
Basis for the tangent space to the active constraints at $x^{(3)}$:
$$t = |A(x^{(3)})| = 1, \qquad n - t = 2 - 1 = 1$$
$$\nabla c_1(x^{(3)}) = \begin{bmatrix} 0 \\ -1 \end{bmatrix} \ \Rightarrow\ Z^{(3)} = \begin{bmatrix} 1 \\ 0 \end{bmatrix} \ \Rightarrow\ W_Z^{(3)} = (Z^{(3)})^T \nabla_x^2 L(x^{(3)}, \lambda^{(3)})\, Z^{(3)} = -45$$
$\lambda_1^{(3)} < 0$ and $W_Z^{(3)}$ negative definite $\Rightarrow x^{(3)}$ is a strict local maximizer


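A numerical version of this check (a sketch only; scipy.linalg.null_space builds $Z$ from the active constraint gradient, with the values reconstructed above):

```python
import numpy as np
from scipy.linalg import null_space

lam1 = -47 / 2
hess_L = np.array([[2.0, 0.0], [0.0, 44.0]]) + lam1 * np.array([[2.0, 0.0], [0.0, 0.0]])
grad_active = np.array([[0.0, -1.0]])        # grad c1 at x(3), as a 1x2 matrix

Z = null_space(grad_active)                   # 2x1 basis of the tangent space, spans [1, 0]^T
WZ = Z.T @ hess_L @ Z
print(WZ, np.linalg.eigvalsh(WZ))             # [[-45.]]: negative definite, and lambda_1 < 0
```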

Sensitivity analysis

Constraint perturbations

Sensitivity to constraint perturbations


Sensitivity results require that the solution $x^*$ is "nice":
1. Linear independence: $\nabla c_i(x^*),\ i \in A(x^*)$, are linearly independent
2. Strict complementarity: multipliers $\lambda_i^* > 0$ for $i \in A(x^*) \cap I$
3. Second order sufficiency: $d^T \nabla_x^2 L(x^*, \lambda^*)\, d > 0$ for all $d \in F^*$, $d \neq 0$

Proposition (Constraint perturbations)
If the constraint $c_i(x) \le 0$ becomes $c_i(x) + \varepsilon_i \le 0$:
The change in the optimal objective is $\Delta f \approx \lambda_i^* \varepsilon_i$
An estimate of the new optimal objective value is $f(\varepsilon_i) = f(x^*) + \Delta f \approx f(x^*) + \lambda_i^* \varepsilon_i$
The Lagrange multiplier $\lambda_i^*$ gives a first order estimate of the rate of change of the objective function with respect to changes in the $i$th constraint.
If inequality constraint $i$ is not active at $x^*$ ($i \notin A^*$), then $\lambda_i^* = 0$, so constraint $i$ can be perturbed (slightly) without affecting the optimal solution.

Sensitivity analysis

Constraint perturbations

Sensitivity Example
Example (Sensitivity)
Replace $c_1(x) = x_1^2 - x_2 \le 0$ by $c_1(x) = x_1^2 - x_2 + \varepsilon_1 \le 0$. When $\varepsilon_1 = 0.1$, estimate the objective value change $\Delta f$ for the local maximizer $x(\varepsilon_1)$, where $x(0) = x^{(3)}$.

Solution
At $x^{(3)} = [\,0\ \ 0\,]^T$: $f^{(3)} = 9$ and $\lambda_1^{(3)} = -\tfrac{47}{2}$
First order estimate: $\Delta f \approx \lambda_1^{(3)} \varepsilon_1 = -\tfrac{47}{2} \times 0.1 = -2.35$ (true value $-2.1379$)

[Figure: optimal value of the problem with the perturbed constraint $c_1(x) = x_1^2 - x_2 + \varepsilon_1 \le 0$, plotted against the perturbation $\varepsilon_1 \in [-0.5, 0.5]$, together with the optimal value at $0$ and the linear approximation $f^{(3)} + \lambda_1^{(3)} \varepsilon_1$.]
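The estimate can be checked directly in this example, since for the perturbed problem the maximizer stays at $x_1 = 0$ on the shifted parabola $x_2 = x_1^2 + \varepsilon_1$ (an observation specific to this example, not a general fact). A sketch:

```python
f = lambda x1, x2: x1**2 + (x2 - 1)**2 * (x2 - 3)**2 + x2 / 2

eps1 = 0.1
lam1 = -47 / 2
f0 = f(0.0, 0.0)                     # 9.0, the unperturbed optimal value
estimate = lam1 * eps1               # first-order estimate of the change, -2.35
true_change = f(0.0, eps1) - f0      # approx -2.1379
print(f0, estimate, true_change)
```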

Sensitivity analysis

Constraint perturbations

Convex Programming
Proposition (KKT sufficiency for global optimality). Let $x^*$ be a feasible point. Let, for each $i = 1, \dots, m_E$, $c_i$ be affine (i.e. both convex and concave), for each $i = m_E + 1, \dots, m$, $c_i$ be convex, and $f$ be convex on $\Omega$. Assume that the KKT conditions (1a)-(1e) hold at $x^*$. Then $x^*$ is a global minimizer.
Note: If, for each $i = 1, \dots, m_E$, $c_i$ is affine (i.e. both convex and concave), for each $i = m_E + 1, \dots, m$, $c_i$ is convex, and $f$ is convex on $\Omega$, then the optimization problem is a convex programming problem.
Proposition: If the problem is a convex programming problem, then any constrained stationary point $x^*$ with $\lambda_i^* \ge 0$ for all $i \in A^* \cap I$ is a global minimizer.


Duality

Strong and weak duality

Duality in convex programming


Primal problem
$$\begin{array}{ll} \underset{x \in \mathbb{R}^n}{\text{Minimize}} & f(x) \\ \text{Subject to} & c_i(x) = 0, \quad i \in E, \\ & c_i(x) \le 0, \quad i \in I. \end{array} \tag{4}$$
Primal feasible region
$$\Omega = \{\, x \in \mathbb{R}^n : c_i(x) = 0,\ i \in E, \ \ c_i(x) \le 0,\ i \in I \,\}$$
Lagrange function $L : \mathbb{R}^n \times \mathbb{R}^m \to \mathbb{R}$:
$$L(x, \lambda) = f(x) + \sum_{i=1}^{m} \lambda_i c_i(x) = f(x) + \lambda^T c(x)$$
The KKT conditions include $\nabla_x L(x^*, \lambda^*) = 0$



Duality

Strong and weak duality

Dual problem
Wolfe dual problem
$$\begin{array}{ll} \underset{y \in \mathbb{R}^n,\ \lambda \in \mathbb{R}^m}{\text{Maximize}} & L(y, \lambda) \\ \text{Subject to} & \nabla_y L(y, \lambda) = 0, \\ & \lambda_i \ge 0, \quad i \in I. \end{array} \tag{5}$$
Dual variables $\lambda \in \mathbb{R}^m$
Inequality constraints $\Rightarrow \lambda_i \ge 0,\ i \in I$
Proposition (Weak duality): Let problem (4) be a convex programming problem. If $x$ is a feasible point for (4) and $(y, \lambda)$ is a feasible point of the dual problem (5), then
$$f(x) \ge L(y, \lambda).$$
In particular, the minimum objective value of (4) is greater than or equal to the maximum objective value of the dual problem.

Duality

Wolfe Duality

Strong Duality
Proposition (Wolfe dual)
Convex primal problem (4) with continuously differentiable $f$, $c_i$, $i \in E \cup I$.
If $x^*$ solves (4) and a regularity condition holds, then $x^*, \lambda^*$ solves the dual problem
$$\begin{array}{ll} \underset{x \in \mathbb{R}^n,\ \lambda \in \mathbb{R}^m}{\text{Maximize}} & L(x, \lambda) \\ \text{Subject to} & \nabla_x L(x, \lambda) = 0, \\ & \lambda_i \ge 0, \quad i \in I, \end{array}$$
and $f(x^*) = L(x^*, \lambda^*)$: the minimum value of the primal equals the maximum value of the dual
Convexity of the primal problem is critical:
$f$, $c_i$, $i \in I$ convex and $c_i$, $i \in E$ affine $\Rightarrow \Omega$ is a convex set
Use the constraints $\nabla_x L(x, \lambda) = 0$ to eliminate the variables $x$ from the dual



Duality

Linear programming duality

Linear programming duality


Example (Linear programming duality)
Consider the (primal) linear programming problem
$$\begin{array}{ll} \underset{x \in \mathbb{R}^n}{\text{Minimize}} & g^T x \\ \text{Subject to} & A_E^T x + b_E = 0, \\ & A_I^T x + b_I \le 0. \end{array}$$
Show that the dual is
$$\begin{array}{ll} \underset{\lambda \in \mathbb{R}^m}{\text{Maximize}} & b^T \lambda \\ \text{Subject to} & A\lambda + g = 0, \\ & \lambda_I \ge 0, \end{array}$$
where
$$\lambda = \begin{bmatrix} \lambda_E \\ \lambda_I \end{bmatrix}, \qquad b = \begin{bmatrix} b_E \\ b_I \end{bmatrix}, \qquad A = \begin{bmatrix} A_E & A_I \end{bmatrix}.$$

Duality

Linear programming duality

Linear programming dual Solution


Solution (Linear programming dual)
Feasible region $\Omega = \{\, x \in \mathbb{R}^n : A_E^T x + b_E = 0,\ A_I^T x + b_I \le 0 \,\}$
All constraints affine $\Rightarrow \Omega$ convex
Objective function $f(x) = g^T x$ linear $\Rightarrow f$ convex
Dual variables $\lambda \in \mathbb{R}^m$, $\lambda^T = [\, \lambda_E^T \ \ \lambda_I^T \,]$, $\lambda_I \ge 0$
Lagrangian:
$$L(x, \lambda) = g^T x + \lambda_E^T (A_E^T x + b_E) + \lambda_I^T (A_I^T x + b_I)$$
Lagrangian gradient:
$$\nabla_x L(x, \lambda) = g + A_E \lambda_E + A_I \lambda_I = g + A\lambda = 0$$
Substituting back into the Lagrangian:
$$L(x, \lambda) = (g + A_E \lambda_E + A_I \lambda_I)^T x + \lambda_E^T b_E + \lambda_I^T b_I = \lambda^T b$$
The Wolfe dual then gives the desired dual linear program.
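A small numerical illustration of this duality (the data below are chosen for illustration, not taken from the slides): with no equality constraints, the primal and the dual are both LPs, and scipy.optimize.linprog returns the same optimal value for each, as strong duality predicts.

```python
import numpy as np
from scipy.optimize import linprog

g = np.array([1.0, 2.0])
A_I = np.array([[-1.0, 0.0, -1.0],           # columns are the gradients of the
                [0.0, -1.0, -1.0]])          # three inequality constraints
b_I = np.array([0.0, 0.0, 1.0])

# Primal: min g^T x  s.t.  A_I^T x + b_I <= 0   (free variables x)
primal = linprog(c=g, A_ub=A_I.T, b_ub=-b_I, bounds=[(None, None)] * 2)

# Dual:  max b^T lam  s.t.  A_I lam + g = 0, lam >= 0   (solved as a minimization)
dual = linprog(c=-b_I, A_eq=A_I, b_eq=-g, bounds=[(0, None)] * 3)

print(primal.fun, -dual.fun)                  # both equal 1.0
```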
