Nonlinear Programming
Direct Search and Gradient Methods
Line Search:
• Line search techniques are, in essence, optimization
algorithms for one-dimensional minimization problems.
• They are often regarded as the backbone of nonlinear
optimization algorithms.
• Typically, these techniques search a bracketed interval.
• Often, unimodality is assumed.
• Unimodality is a term used in several contexts in
mathematics.
• Originally, it relates to possessing a unique mode, i.e. a
single highest (or lowest) value.
Unimodal function
• A unimodal function has only one peak in a given
interval.
• Thus, a function of one variable is said to be unimodal
on a given interval [a, b] if it has either a unique
minimum or a unique maximum on [a, b].
• Mathematically:
Let x* be the minimum point of f(x), which is unimodal on
[a, b]. Then for any two points x1 < x2 in [a, b]:
(i) x2 < x* implies f(x1) > f(x2), and
(ii) x1 > x* implies f(x1) < f(x2).
That is, f decreases strictly to the left of x* and increases
strictly to the right.
Example
• The quadratic function f(x) = ax² + bx + c, where a ≠ 0: if the
coefficient a is positive, the function opens upward and has a
single minimum point; if a is negative, the function opens
downward and has a single maximum point.
• The Gaussian or normal distribution function: a bell-shaped
curve with a single maximum at the mean, decreasing
symmetrically on both sides.
• The logistic function: often used in modelling growth
processes. Its growth rate (its derivative) is unimodal: it
increases, reaches a maximum at the inflection point, and
then decreases.
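To make the definition concrete, here is a minimal Python sketch (not from the slides; the helper name is illustrative) that checks unimodality numerically: the sampled values of a function with a single interior minimum should strictly decrease to one trough and then strictly increase.

```python
import numpy as np

def is_unimodal_min(f, a, b, n=1001):
    """Grid check: a function with a single interior minimum should
    strictly decrease to one trough and then strictly increase."""
    x = np.linspace(a, b, n)
    y = f(x)
    k = int(np.argmin(y))                    # index of the sampled trough
    return bool(np.all(np.diff(y[:k + 1]) < 0) and
                np.all(np.diff(y[k:]) > 0))

print(is_unimodal_min(lambda x: x**2 - 2*x, 0.0, 1.5))   # True
print(is_unimodal_min(lambda x: np.sin(5*x), 0.0, 3.0))  # False: several troughs
```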
Direct Search Method/Line Search
• The idea of direct search methods is to identify the
interval of uncertainty that is known to include the
optimum solution point.
• Measure of effectiveness: Let
L1 : width of the initial interval of uncertainty and
Ln : width of the interval of uncertainty after n
experiments.
Then the measure of effectiveness of any search technique
is defined as
E = Ln / L1
• The procedure locates the optimum by iteratively
narrowing the interval of uncertainty to any desired level
of accuracy.
• Two algorithms to study:
– Fibonacci Search Method
– Golden Section Method.
Fibonacci Search Method
• The search interval is reduced according to Fibonacci
numbers.
• Fibonacci numbers are calculated as follows:
"! = "! !! + "! ! " $ %&'(' ! " "$ "# = ! )*+ "! = !,
• The first few Fibonacci numbers are
F0 = 1, F1 = 1, F2 = 2, F3 = 3, F4 = 5, F5 = 8, F6 = 13, F7 = 21, F8 = 34, …
• The property of Fibonacci numbers is used to create a
search algorithm that requires only one function
evaluation at each iteration.
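As an illustration, a small Python sketch (helper names are mine, not from the slides) that generates Fibonacci numbers with this convention and picks the number of experiments n for a target reduction, assuming the classical result that n experiments shrink the interval to about L1/Fn (the last experiment being slightly displaced, as noted below):

```python
def fib(n):
    """Fibonacci numbers with the slide convention F0 = F1 = 1."""
    a, b = 1, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def evaluations_needed(ratio):
    """Smallest n with 1/F_n <= ratio, i.e. F_n >= 1/ratio,
    assuming the final interval after n experiments is about L1/F_n."""
    n = 0
    while fib(n) < 1.0 / ratio:
        n += 1
    return n

print(evaluations_needed(0.25))  # 4, since F4 = 5 >= 1/0.25 = 4
print(evaluations_needed(0.13))  # 5, since F5 = 8 >= 1/0.13 ≈ 7.7
```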
Fibonacci Search Method
• Let L1 = b – a; then choose x1 and x2 in L1 such that
x1 = a + (Fn-2 / Fn) L1,
x2 = a + (Fn-1 / Fn) L1.
[Figure: the interval [a, b] with interior points x1 and x2]
Example 1
• Find the minimum of f(x) = x² - 2x, 0 ≤ x ≤ 1.5, within an
interval of uncertainty of 0.25 L1, where L1 is the original
interval of uncertainty.
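A possible run, assuming the fibonacci_search sketch above and n = 4 experiments (since F4 = 5 ≥ 1/0.25):

```python
f = lambda x: x**2 - 2*x
print(fibonacci_search(f, 0.0, 1.5, n=4))
# about 1.05; the true minimizer is x* = 1 with f(x*) = -1, and the
# final bracket has width about 1.5/F4 = 0.3 <= 0.25 * 1.5
```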
Example 2:
• Find the minimum of f(x) = (x - 1)(x - 2)(x - 3), 1 ≤ x ≤ 3,
within an interval of uncertainty of 0.13 L1, where L1 is the
original interval of uncertainty.
Golden Section Search Method
• One difficulty of the Fibonacci search method is that
the Fibonacci numbers have to be calculated and stored.
• Another is that the proportion of the interval eliminated
changes from iteration to iteration.
• The golden section search method overcomes these two
problems while still requiring only one new function
evaluation per iteration.
Golden Section Search Method
• For large n, the Fibonacci fraction Fn-1/Fn converges to the
golden section ratio τ = 0.618034…
• Just like the Fibonacci method, this method maintains a
uniform reduction strategy:
Ln = Ln+1 + Ln+2, where τ = Ln+1/Ln = Ln+2/Ln+1 for every n.
Substituting gives τ² + τ - 1 = 0, whose positive root is
τ = (√5 - 1)/2 ≈ 0.618.
• This number is referred to as the golden section ratio.
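A quick numerical check of this convergence (plain Python, illustrative only):

```python
F = [1, 1]                       # F0 = F1 = 1, as above
for _ in range(20):
    F.append(F[-1] + F[-2])
for n in (5, 10, 20):
    print(n, F[n - 1] / F[n])    # 0.625, then 0.61797..., then 0.61803...
```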
Golden Section Search Method
• Let L1 = b – a; then choose x1 and x2 in L1 such that
x1 = a + (1 - τ) L1, or equivalently x1 = b - τ L1,
x2 = a + τ L1.
[Figure: the interval [a, b] with interior points x1 and x2]
Example:
Use the Golden Section Search method to find the
minimum of
– f(x) = x² - 2x, 0 ≤ x ≤ 1.5, within an interval of
uncertainty of 0.25 L1.
(Exact solution: x* = 1, f(x*) = -1)
– f(x) = (x - 1)(x - 2)(x - 3), 1 ≤ x ≤ 3, within an interval
of uncertainty of 0.13 L1.
(Exact solution: x* = 2 + 1/√3 ≈ 2.577, f(x*) = -2/(3√3) ≈ -0.385)
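Assuming the golden_section_search sketch above, both exercises can be run as follows; the printed minimizers land within the required intervals of uncertainty of the exact solutions:

```python
for f, a, b, frac in [
    (lambda x: x**2 - 2*x,        0.0, 1.5, 0.25),
    (lambda x: (x-1)*(x-2)*(x-3), 1.0, 3.0, 0.13),
]:
    xmin = golden_section_search(f, a, b, tol=frac * (b - a))
    print(round(xmin, 3), round(f(xmin), 3))
# about (0.969, -0.999) and (2.618, -0.382)
```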