NUMERICAL METHODS
NUMERICAL OPTIMIZATION
Department of Chemical Engineering
Faculty of Industrial Technology
Parahyangan Catholic University
April 2024
Introduction
• An optimization or mathematical programming problem is generally stated
  as: Find x, which minimizes or maximizes f(x) subject to
      di(x) ≤ ai,   i = 1, 2, ..., m   (*)
      ei(x) = bi,   i = 1, 2, ..., p   (*)
  where x is an n-dimensional design vector, f(x) is the objective function,
  di(x) are inequality constraints, ei(x) are equality constraints, and
  ai and bi are constants.
• Optimization problems can be classified on the basis of the form of
f(x):
– If f(x) and the constraints are linear, we have linear programming.
– If f(x) is quadratic and the constraints are linear,
we have quadratic programming.
– If f(x) is not linear or quadratic and/or the constraints are nonlinear,
we have nonlinear programming.
• When equations (*) are included, we have a constrained optimization
  problem; otherwise, it is an unconstrained optimization problem.
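For illustration, here is one hypothetical problem written in the general form above (this specific example is my own, not from the slides):

```latex
% Hypothetical example (not from the slides), written in the general form above.
\begin{align*}
\text{minimize}\quad   & f(x) = (x_1 - 1)^2 + (x_2 - 2)^2           && \text{(quadratic objective)}\\
\text{subject to}\quad & d_1(x) = x_1 + x_2 \le 3 \quad (a_1 = 3)   && \text{(inequality constraint)}\\
                       & e_1(x) = x_1 - x_2 = 0   \quad (b_1 = 0)   && \text{(equality constraint)}
\end{align*}
% A quadratic f(x) with linear constraints makes this a quadratic programming (QP) problem.
```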
One-Dimensional Unconstrained Optimization
• Root finding and optimization are related: both involve guessing and
  searching for a point on a function. The difference is:
  – Root finding searches for the zeros of a function.
  – Optimization searches for the minimum or the maximum of a function
    of one or more variables.
• In multimodal functions, both local
and global optima can occur. We are
mostly interested in finding the
absolute highest or lowest value of a
function.
Golden Ratio
A unimodal function has a single maximum or minimum in a given interval.
For a unimodal function:
• First pick two points [xl, xu] that bracket your extremum.
• Then pick two more points within this interval to determine whether the
  maximum has occurred within the first three or the last three points.
The interval lengths satisfy
      l0 = l1 + l2    and    l1/l0 = l2/l1
Let R = l2/l1 (the golden ratio). Then
      l1/(l1 + l2) = l2/l1   =>   1/(1 + R) = R   =>   R² + R - 1 = 0
      R = (-1 + √(1 + 4(1)))/2 = (√5 - 1)/2 ≈ 0.61803
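As a quick numerical sanity check on this derivation (a minimal sketch; the snippet itself is not from the slides):

```python
# Quick check (not from the slides) that R = (sqrt(5) - 1)/2 solves R^2 + R - 1 = 0.
import math

R = (math.sqrt(5) - 1) / 2
print(R)               # ~0.6180339887, the golden ratio used below
print(R**2 + R - 1)    # ~0.0 up to floating-point round-off
```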
Golden Section Search
• Pick two initial guesses, xl and xu, that bracket one
local extremum of f(x):
• Choose two interior points x1 and x2 according to the golden ratio:
      d  = ((√5 - 1)/2) (xu - xl)
      x1 = xl + d
      x2 = xu - d
• Evaluate the function at x1 and x2:
• If f(x1) > f(x2), then the domain of x to the left of x2 (from xl to x2)
  does not contain the maximum and can be eliminated; x2 becomes the new xl.
• If f(x2) > f(x1), then the domain of x to the right of x1 (from x1 to xu)
  can be eliminated; in this case, x1 becomes the new xu.
• The benefit of using the golden ratio is that we do not need to recalculate
  all the function values in the next iteration:
  if f(x1) > f(x2), the new x2 = old x1; else the new x1 = old x2.
• Stopping criterion: |xu - xl| < ε
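A minimal Python sketch of this procedure for locating a maximum (the function name, tolerance, and iteration cap are my own choices, not from the slides):

```python
import math

def golden_section_max(f, xl, xu, tol=1e-5, max_iter=100):
    """Golden-section search for a maximum of f on [xl, xu]."""
    R = (math.sqrt(5) - 1) / 2           # golden ratio, ~0.61803
    d = R * (xu - xl)
    x1, x2 = xl + d, xu - d              # interior points, xl < x2 < x1 < xu
    f1, f2 = f(x1), f(x2)
    for _ in range(max_iter):
        if abs(xu - xl) < tol:           # stopping criterion |xu - xl| < eps
            break
        if f1 > f2:                      # maximum cannot lie in [xl, x2]: drop it
            xl = x2
            x2, f2 = x1, f1              # old x1 becomes the new x2 (reuse f1)
            x1 = xl + R * (xu - xl)
            f1 = f(x1)
        else:                            # maximum cannot lie in [x1, xu]: drop it
            xu = x1
            x1, f1 = x2, f2              # old x2 becomes the new x1 (reuse f2)
            x2 = xu - R * (xu - xl)
            f2 = f(x2)
    return (x1, f1) if f1 > f2 else (x2, f2)
```

Because the retained interior point already has a known function value, each iteration costs only one new function evaluation.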
Example: The Golden-Section Search
Find the maximum of f(x) = 2 sin x - x²/10 (using xl = 0 and xu = 4).
xl      f(xl)    x2      f(x2)    x1      f(x1)    xu       f(xu)    d
0.0000  0.0000   1.5279  1.7647   2.4721  0.6300   4.0000  -3.1136   2.4721
0.0000  0.0000   0.9443  1.5310   1.5279  1.7647   2.4721   0.6300   1.5279
0.9443  1.5310   1.5279  1.7647   1.8885  1.5432   2.4721   0.6300   0.9443
0.9443  1.5310   1.3050  1.7595   1.5279  1.7647   1.8885   1.5432   0.5836
1.3050  1.7595   1.5279  1.7647   1.6656  1.7136   1.8885   1.5432   0.3607
1.3050  1.7595   1.4427  1.7755   1.5279  1.7647   1.6656   1.7136   0.2229
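The first row of the table can be checked with a few lines of Python (a self-contained sketch; the printed values should agree with the table up to rounding):

```python
# First golden-section interior points for the example (xl = 0, xu = 4);
# compare with the first row of the table above.
import math

f = lambda x: 2 * math.sin(x) - x**2 / 10

R = (math.sqrt(5) - 1) / 2
d = R * (4 - 0)
x1, x2 = 0 + d, 4 - d
print(d, x2, x1)        # ~2.4721, ~1.5279, ~2.4721
print(f(x2), f(x1))     # ~1.7647, ~0.6300
```

Running the full golden_section_max sketch above on this bracket converges toward the same optimum found by Newton's method later in these notes (x ≈ 1.4276).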
Quadratic Interpolation
[Figure: f(x) and its quadratic approximation g(x), with points x0, x1, x3, x2 on the x-axis and the optima of g(x) and f(x) marked.]
Idea:
(i) Approximate f(x) using a quadratic function g(x) = ax² + bx + c
(ii) Optima of f(x) ≈ optima of g(x)
• The shape near an optimum typically looks like a parabola, so we can
  approximate the original function f(x) by a quadratic function
  g(x) = ax² + bx + c.
• At the optimum point of g(x), g'(x) = 2ax + b = 0.
  Let x3 be the optimum point; then x3 = -b/(2a).
• How do we compute a and b?
  – 2 points => unique straight line (1st-order polynomial)
  – 3 points => unique parabola (2nd-order polynomial)
  – So, we need to pick three points that surround the optimum.
  – Let these points be x0, x1, x2 such that x0 < x1 < x2.
Quadratic Interpolation
• a and b can be obtained by solving the system of
linear equations
      a x0² + b x0 + c = f(x0)
      a x1² + b x1 + c = f(x1)
      a x2² + b x2 + c = f(x2)
• Substituting a and b into x3 = -b/(2a) yields
      x3 = [f(x0)(x1² - x2²) + f(x1)(x2² - x0²) + f(x2)(x0² - x1²)]
           / [2 f(x0)(x1 - x2) + 2 f(x1)(x2 - x0) + 2 f(x2)(x0 - x1)]
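A direct translation of this formula into a short Python helper (a sketch; the function name and the sample starting points in the comment are my own, not from the slides):

```python
# One quadratic-interpolation step: fit a parabola through (x0, x1, x2) and
# return its vertex x3 = -b/(2a).
def quadratic_step(f, x0, x1, x2):
    f0, f1, f2 = f(x0), f(x1), f(x2)
    num = f0 * (x1**2 - x2**2) + f1 * (x2**2 - x0**2) + f2 * (x0**2 - x1**2)
    den = 2 * f0 * (x1 - x2) + 2 * f1 * (x2 - x0) + 2 * f2 * (x0 - x1)
    return num / den

# Example with assumed starting points (not recovered from the slides):
# import math
# f = lambda x: 2 * math.sin(x) - x**2 / 10
# quadratic_step(f, 0.0, 1.0, 4.0)   # -> about 1.5055
```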
Quadratic Interpolation
• The process can be repeated to improve the
approximation.
• Next, decide which sub-interval to discard
  – Assuming f(x3) > f(x1):
    if x3 > x1, discard the interval to the left of x1,
    i.e., set x0 = x1 and x1 = x3;
    if x3 < x1, discard the interval to the right of x1,
    i.e., set x2 = x1 and x1 = x3.
• Calculate x3 based on the new x0, x1, x2 (see the iterative sketch below).
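Putting the update rule together with the x3 formula gives an iterative sketch (the naming and tolerance are my own; the two "extra" branches handle the f(x3) ≤ f(x1) cases, which the slide does not spell out):

```python
def quadratic_interp_max(f, x0, x1, x2, tol=1e-5, max_iter=50):
    """Repeated quadratic interpolation for a maximum bracketed by x0 < x1 < x2."""
    for _ in range(max_iter):
        f0, f1, f2 = f(x0), f(x1), f(x2)
        num = f0 * (x1**2 - x2**2) + f1 * (x2**2 - x0**2) + f2 * (x0**2 - x1**2)
        den = 2 * f0 * (x1 - x2) + 2 * f1 * (x2 - x0) + 2 * f2 * (x0 - x1)
        x3 = num / den                    # vertex of the fitted parabola
        if abs(x3 - x1) < tol:
            return x3
        if x3 > x1:
            if f(x3) > f1:                # slide's case: discard left of x1
                x0, x1 = x1, x3
            else:                         # extra case: discard right of x3
                x2 = x3
        else:
            if f(x3) > f1:                # slide's case: discard right of x1
                x2, x1 = x1, x3
            else:                         # extra case: discard left of x3
                x0 = x3
    return x1
```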
Quadratic Interpolation: Example
[Figure: 1st iteration]
[Figure: 2nd iteration]
Characteristics of Optima
To find the optima, we can find the zeros of f'(x). The sign of f''(x) then tells us whether the point is a maximum (f''(x) < 0), a minimum (f''(x) > 0), or possibly an inflection point (f''(x) = 0).
Newton’s Method
Let g(x) = f'(x).
Then the zeros of g(x) are the optima of f(x).
Substituting g(x) into the updating formula of the Newton-Raphson method,
we have
      xi+1 = xi - g(xi)/g'(xi) = xi - f'(xi)/f''(xi)
Note: other root-finding methods will also work.
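A minimal sketch of this update in Python (the function name, arguments, and stopping tolerance are my own choices, not from the slides):

```python
def newton_optimize(df, d2f, x0, tol=1e-6, max_iter=50):
    """Newton's method on f'(x): returns a stationary point of f.

    df and d2f are callables for f'(x) and f''(x); check the sign of f''
    at the result to classify it as a maximum or a minimum.
    """
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)      # f'(x_i) / f''(x_i)
        x -= step                  # x_{i+1} = x_i - f'(x_i)/f''(x_i)
        if abs(step) < tol:        # stop when the update is small
            break
    return x
```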
Example: Newton's Method
Use Newton's method to find the maximum of
f(x) = 2 sin x - x²/10 with an initial guess of x0 = 2.5
Solution:
      xi+1 = xi - f'(xi)/f''(xi) = xi - (2 cos xi - xi/5)/(-2 sin xi - 1/5)
      x1 = 2.5 - (2 cos 2.5 - 2.5/5)/(-2 sin 2.5 - 1/5) = 0.995,  f(0.995) = 1.578
i    x        f(x)      f'(x)     f''(x)
0    2.5      0.572    -2.102    -1.3969
1    0.995    1.578     0.8898   -1.8776
2    1.469    1.774    -0.0905   -2.1896
3    1.4276   1.77573  -0.0002   -2.17954
4    1.4275   1.77573   0.0000   -2.17952
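The table above can be reproduced with a few lines of Python (a self-contained sketch; the printed values are expected to match the table up to rounding):

```python
import math

f   = lambda x: 2 * math.sin(x) - x**2 / 10    # f(x)
df  = lambda x: 2 * math.cos(x) - x / 5        # f'(x)
d2f = lambda x: -2 * math.sin(x) - 1 / 5       # f''(x)

x = 2.5                                        # initial guess x0
for i in range(4):
    x = x - df(x) / d2f(x)                     # Newton update
    print(i + 1, x, f(x))                      # compare with rows 1-4 of the table
```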
Newton’s Method
• Shortcomings
– Need to derive f'(x) and f"(x).
– May diverge
– May "jump" to another solution far away
• Advantages
– Fast convergence rate near the solution
– Hybrid approach: use a bracketing method to find an approximation near
  the solution, then switch to Newton's method.
Summary
• Basics
– Minimizing f(x) is equivalent to maximizing -f(x)
– If f'(x) exists, then to find the optima of f(x), we can find the
  zeros of f'(x)
• Beware of inflection points of f(x)
• Bracketing methods
– Golden-Section Search and Quadratic Interpolation
– How to select points and discard intervals
Exercises
• Employ the following methods to find the maximum of
  f(x) = 4x - 1.8x² + 1.2x³ - 0.3x⁴
  for three iterations:
(a) Golden-section search (xl = 2, xu = 4)
(b) Newton’s method (x0 = 3)