L6 Optimality Criterion - Constrained Problem

The document discusses constrained optimization problems, detailing the formulation of such problems, including objective functions and constraints. It introduces the Kuhn-Tucker conditions as necessary criteria for optimality in constrained optimization and provides examples to illustrate the identification of Kuhn-Tucker points. Additionally, it presents the Kuhn-Tucker necessity and sufficiency theorems, emphasizing the conditions under which solutions are optimal.

21MEE306 Optimization Techniques

Class Notes (Internal Circulation)

6. Optimality Criterion (Constrained Optimization)

6.1 Constrained Optimization problem

An optimization or a mathematical programming problem can be stated as follows:

Find X = (x1, x2, …, xn)T which minimizes f(X) (Eq.6.1)

subject to the following constraints:


gj(x) ≥ 0, where j = 1, 2, …, J (Eq.6.2)
hk(x) = 0, where k = 1, 2, …, K (Eq.6.3)
and
lb ≤ xi ≤ ub, where 'lb' and 'ub' are the lower and upper bounds for the decision variables, and i = 1, 2, …, n (Eq.6.4)
where
 'X' is an 'n'-dimensional vector called the design vector,

 f (x) is termed the objective function, and
 g j ( x ) and h k ( x ) are known as ‘inequality’ and ‘equality constraints’, respectively.
 The number of variables ‘n’ and the number of constraints ‘J’ and/or ‘K’ need not be related in any way.
 Equation (6.4) represents the bounds for the design variables.
The problem stated in Eqs. (6.1–6.4) is called a 'constrained optimization problem'. A constrained optimization problem may or may not include equality constraints, inequality constraints, or both.

6.2 General Notes on Constrained Optimization


 A point x(t) is said to satisfy a constraint if the left-side expression of the constraint, evaluated at that point, satisfies the stated relation (≥, ≤, or =) with the right-side value.
 A point (or solution) is defined as a feasible point (or solution) if all equality and inequality constraints and variable bounds are satisfied
at that point.
 All other points are known as infeasible points.
 It is important to note that an infeasible point can never be the optimum point.
 There are two ways an inequality constraint can be satisfied at a point—the point falls either on the constraint surface (that is, gj(x(t)) = 0)
or on the feasible side of the constraint surface (where gj(x(t)) is positive).
 If the point falls on the constraint surface, the constraint is said to be an active constraint at that point. In the latter case, the constraint is
inactive at the point.

Optimality Criteria for Constrained Optimization problem


6.3 Kuhn-Tucker Conditions
In constrained optimization problems, points satisfying Kuhn-Tucker conditions are likely candidates for the optimum.
In the case of unconstrained optimization problems, not all stationary points are optimal points and in constrained optimization problems, not all
Kuhn-Tucker points (points that satisfy Kuhn-Tucker conditions) are optimal points.
Many constrained optimization algorithms are designed to find Kuhn-Tucker points.

6.4 First order Optimality conditions (KT Conditions)
∇f(x) − Σ(j=1 to J) uj ∇gj(x) − Σ(k=1 to K) vk ∇hk(x) = 0 (Eq.6.5)
gj(x) ≥ 0, j = 1, 2, …, J (Eq.6.6)
hk(x) = 0, k = 1, 2, …, K (Eq.6.7)
uj gj(x) = 0, j = 1, 2, …, J (Eq.6.8)
uj ≥ 0, j = 1, 2, …, J (Eq.6.9)

 The multiplier uj corresponds to the j-th inequality constraint and the multiplier vk corresponds to the k-th equality constraint.
 There are a total number of J entries in the u-vector and K entries in the v-vector.
 The first equation (Eq.6.5) arises from the optimality condition of the unconstrained Lagrangian function.
 The second equation (Eq.6.6) requires the inequality constraints to be satisfied.
 The third (Eq.6.7) requires the equality constraints to be satisfied.
 The fourth equation (Eq.6.8) arises only for inequality constraints. If the j-th inequality constraint is active at a point x (that is, gj(x) = 0), the product uj gj(x) = 0 holds automatically.
 If an inequality constraint is inactive at a point x (that is, gj(x) > 0), the Lagrange multiplier uj must be zero, meaning that in the neighbourhood of the point x the constraint has no effect on the optimum point.
 The final inequality condition (Eq.6.9) requires the multipliers of the inequality constraints to be nonnegative; a multiplier can be strictly positive only for an active constraint.
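The five condition sets above can be checked mechanically at any candidate point. Below is a minimal sketch (the helper name is an assumption, not from the notes), restricted to problems with only inequality constraints (K = 0), as in the worked example that follows:

```python
import numpy as np

def is_kt_point(grad_f, grads_g, g_vals, u, tol=1e-6):
    """Check the Kuhn-Tucker conditions (Eq.6.5-6.9) at a candidate point.

    grad_f  : gradient of f at the point, length-n sequence
    grads_g : list of gradients of each g_j at the point
    g_vals  : list of constraint values g_j(x)
    u       : candidate Lagrange multipliers u_j
    """
    grad_f = np.asarray(grad_f, dtype=float)
    # Eq.6.5: stationarity  grad f - sum_j u_j grad g_j = 0
    stationarity = grad_f - sum(uj * np.asarray(gj, dtype=float)
                                for uj, gj in zip(u, grads_g))
    if np.any(np.abs(stationarity) > tol):
        return False
    # Eq.6.6: primal feasibility  g_j(x) >= 0
    if any(gj < -tol for gj in g_vals):
        return False
    # Eq.6.8: complementary slackness  u_j * g_j(x) = 0
    if any(abs(uj * gj) > tol for uj, gj in zip(u, g_vals)):
        return False
    # Eq.6.9: dual feasibility  u_j >= 0
    if any(uj < -tol for uj in u):
        return False
    return True
```

With K = 0 there is no equality-constraint check (Eq.6.7); adding one would only require testing |hk(x)| ≤ tol for each k.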

6.5 KT Points
 A point x(t) and two vectors u(t) and v(t) that satisfy all the above conditions (Eq.6.5 – 6.9) constitute a Kuhn-Tucker point (also known as a K-T point).
 There are a total of (N + 3J + K) Kuhn-Tucker conditions, as can be seen from Equations (Eq.6.5 – 6.9).
 In the above formulation, the variable bounds are also considered as inequality constraints.
 Thus, each variable bound xi(L) ≤ xi ≤ xi(U) can be written as two inequality constraints as follows:
xi − xi(L) ≥ 0
xi(U) − xi ≥ 0
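A small sketch of this bound-to-inequality conversion for a single variable (the helper name is hypothetical):

```python
def bounds_to_inequalities(lb, ub):
    """Rewrite the bound lb <= x <= ub as two '>= 0' inequality constraints."""
    g_lower = lambda x: x - lb   # x - lb >= 0 enforces x >= lb
    g_upper = lambda x: ub - x   # ub - x >= 0 enforces x <= ub
    return g_lower, g_upper
```

For the bound 0 ≤ x ≤ 10, both returned constraints evaluate to nonnegative values at x = 3, so the point is within bounds; a negative value would flag a bound violation.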

Example to Illustrate the KT Conditions (Deb)
Minimize f(x) = (x1² + x2 − 11)² + (x1 + x2² − 7)²
Subject to
g1(x) = 26 − (x1 − 5)² − x2² ≥ 0
g2(x) = 20 − 4x1 − x2 ≥ 0
x1, x2 ≥ 0
Investigate whether each of the following points is a K-T point:
x(1) = (1, 5)T, x(2) = (0, 0)T, x(3) = (3, 2)T, and x(4) = (3.396, 0)T

Solution


Figure 6.1. Graphical Representation of feasible space and design points

At first, we transform the variable bounds to two inequality constraints:

g3(x) = x1 ≥ 0, and
g4(x) = x2 ≥ 0.
Thus, the above problem has four inequality constraints (J = 4) and no equality constraint (K = 0).
No. of KT Conditions
There are two problem variables: N = 2.
Four inequality constraints (J = 4) and
no equality constraint (K = 0).
Thus, for each point a total of (2 + 3 × 4 + 0) or 14 Kuhn-Tucker conditions need to be checked. [(N + 3J + K)]

Formulating K-T Conditions

To formulate all K-T conditions, we first calculate the gradient of the objective function.

Table 6.1 Gradient and Constraint Values at Four Different Points for the Constrained Function. [The gradients for constraints g2, g3, and g4 are the same for all points: ∇g2(x(t)) = (−4, −1)T, ∇g3(x(t)) = (1, 0)T, and ∇g4(x(t)) = (0, 1)T.]

Using the values in Table 6.1, we formulate the Kuhn-Tucker conditions in terms of a u-vector and investigate whether a feasible u-vector can be
obtained by satisfying all conditions. If one such vector exists, then the chosen point is a K-T point.

a) Investigating whether the point x(1) = (1, 5)T is a K-T point

Using the values given in Table 6.1 and substituting into the K-T conditions (Eq.6.5 – 6.9) at the point x(1) = (1, 5)T, we get the following 14 conditions:

It is clear that the third condition is not satisfied. This is enough to conclude that the point x(1) is not a K-T point. In fact, since the first constraint
value is negative, this constraint is violated at this point and the point x(1) is not a feasible point, as shown in Figure 6.1. If a point is infeasible,
the point cannot be an optimal point.

b) Investigating whether the point x(2) = (0, 0)T is a K-T point

Using the values given in Table 6.1 and substituting into the K-T conditions (Eq.6.5 – 6.9) at the point x(2) = (0, 0)T, we get the following 14 conditions:

All but the final set of conditions reveal that u1 = 0, u2 = 0, u3 = −14 and u4 = −22. The final set of conditions is not satisfied with these values, because u3 and u4 are negative. Thus, the point x(2) is not a K-T point. Since no constraints are violated, the point is a feasible point, as shown in Figure 6.1. Nevertheless, the point x(2) cannot be an optimal point.

c) Investigating whether the point x(3) = (3, 2)T is a K-T point

Using the values given in Table 6.1 and substituting into the K-T conditions (Eq.6.5 – 6.9) at the point x(3) = (3, 2)T, we get the following 14 conditions:

The vector u∗ = (0, 0, 0, 0)T satisfies all the above conditions. Thus, the point x(3) is a K-T point (Figure 6.1). As mentioned earlier, K-T points are likely candidates for minimal points.

d) Investigating whether the point x(4) = (3.396, 0)T is a K-T point

Using the values given in Table 6.1 and substituting into the K-T conditions (Eq.6.5 – 6.9) at the point x(4) = (3.396, 0)T, we get the following 14 conditions:

The solution to the above conditions is the vector u∗ = (0, 0, 0, 1)T. Thus, the point x(4) is also a K-T point. It is clear from the figure that the point x(3) is the minimum point, but the point x(4) is not a minimum point.
Thus, we may conclude from the above exercise that a K-T point may or may not be a minimum point. But if a point is not a K-T point (such as x(1) or x(2)), then it cannot be an optimum point.
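The by-hand checks above can also be reproduced numerically. A sketch under the example's own definitions (the helper names are hypothetical):

```python
def grad_f(x1, x2):
    """Gradient of f(x) = (x1^2 + x2 - 11)^2 + (x1 + x2^2 - 7)^2."""
    a = x1**2 + x2 - 11
    b = x1 + x2**2 - 7
    return (4 * x1 * a + 2 * b, 2 * a + 4 * x2 * b)

def constraints(x1, x2):
    """Values of g1..g4; a negative entry flags a violated constraint."""
    return (26 - (x1 - 5)**2 - x2**2,  # g1
            20 - 4 * x1 - x2,          # g2
            x1,                        # g3 (bound x1 >= 0)
            x2)                        # g4 (bound x2 >= 0)
```

At x(3) = (3, 2), grad_f returns (0, 0) and every constraint value is positive, so u∗ = (0, 0, 0, 0)T satisfies Eq.6.5 – 6.9; at x(1) = (1, 5), g1 evaluates to −15, confirming the infeasibility noted in part (a).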

6.6 Kuhn-Tucker necessity theorem

If x∗ is an optimal solution to the NLP and a constraint qualification holds at x∗ (for example, the gradients of the active constraints are linearly independent), then there exists a (u∗, v∗) such that (x∗, u∗, v∗) satisfies the Kuhn-Tucker conditions.

6.7 Kuhn-Tucker sufficiency theorem

Let the objective function f(x) be convex, the inequality constraints gj(x) all be concave functions for j = 1, 2, …, J, and the equality constraints hk(x), for k = 1, 2, …, K, be linear. If there exists a solution (x∗, u∗, v∗) that satisfies the K-T conditions, then x∗ is an optimal solution to the NLP problem.

Illustrating the Sufficiency Theorem

Minimize the convex function f(x1, x2) = (x1 − 3)² + (x2 − 2)²
Subject to
g1(x) = 26 − (x1 − 5)² − x2² ≥ 0
g2(x) = 20 − 4x1 − x2 ≥ 0
x1, x2 ≥ 0

Investigate whether the point x(3) = (3, 2)T is an optimal point.

Solution:
The objective function has the following (constant) Hessian matrix:

∇²f(x) = [[2, 0], [0, 2]]

The leading principal determinants are |2| and |∇²f(x)|, i.e., 2 and 4, respectively.
Since both values are positive, the Hessian matrix is positive-definite and the function f(x1, x2) is a convex function.
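The leading-principal-minor test used here (Sylvester's criterion) can be sketched in Python; the function name is an assumption, not from the notes:

```python
import numpy as np

def is_positive_definite(H):
    """Sylvester's criterion: a symmetric matrix is positive-definite
    iff every leading principal minor is strictly positive."""
    H = np.asarray(H, dtype=float)
    return all(np.linalg.det(H[:k, :k]) > 0 for k in range(1, H.shape[0] + 1))
```

For the Hessian [[2, 0], [0, 2]] the minors are 2 and 4, so the test passes and f is confirmed convex.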

Let us consider the point x(3) = (3, 2)T .

The point x(3) is a K-T point with a u vector: u∗= (0, 0, 0, 0)T .

The first constraint g1(x) is a concave function because the matrix −∇²g1(x) = [[2, 0], [0, 2]] is a positive-definite matrix (equivalently, ∇²g1(x) is negative-definite).
The constraint g2(x) is also a concave function because a linear function is both convex and concave.
Thus, we can conclude that the point (3, 2)T is a minimum point of the above constrained problem.
