Nonlinear Control
Lecture # 2
Stability of Equilibrium Points
Basic Concepts
$\dot{x} = f(x)$

$f$ is locally Lipschitz over a domain $D \subset \mathbb{R}^n$

Suppose $\bar{x} \in D$ is an equilibrium point; that is, $f(\bar{x}) = 0$

Characterize and study the stability of $\bar{x}$

For convenience, we state all definitions and theorems for the
case when the equilibrium point is at the origin of $\mathbb{R}^n$; that is,
$\bar{x} = 0$. No loss of generality:

$y = x - \bar{x}$

$\dot{y} = \dot{x} = f(x) = f(y + \bar{x}) \stackrel{\text{def}}{=} g(y), \quad \text{where } g(0) = 0$
Definition 3.1
The equilibrium point $x = 0$ of $\dot{x} = f(x)$ is

stable if for each $\varepsilon > 0$ there is $\delta > 0$ (dependent on $\varepsilon$) such that
$\|x(0)\| < \delta \;\Rightarrow\; \|x(t)\| < \varepsilon, \;\forall\, t \ge 0$

unstable if it is not stable

asymptotically stable if it is stable and $\delta$ can be chosen such that
$\|x(0)\| < \delta \;\Rightarrow\; \lim_{t \to \infty} x(t) = 0$
Scalar Systems (n = 1)
The behavior of $x(t)$ in the neighborhood of the origin can be
determined by examining the sign of $f(x)$

The requirement for stability is violated if $x f(x) > 0$ on
either side of the origin
[Figure: three plots of $f(x)$ illustrating unstable equilibria at the origin]
The origin is stable if and only if $x f(x) \le 0$ in some
neighborhood of the origin
[Figure: three plots of $f(x)$ illustrating stable equilibria at the origin]
The origin is asymptotically stable if and only if $x f(x) < 0$ in
some neighborhood of the origin
[Figure: plots of $f(x)$ for (a) an asymptotically stable and (b) a globally asymptotically stable origin]
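A quick numerical illustration (added; not part of the slides) of the sign test: simulating one vector field with $x f(x) < 0$ and one with $x f(x) > 0$ near the origin. It assumes numpy and scipy are available.

```python
# Sign test x*f(x) near the origin for scalar systems (illustrative sketch).
import numpy as np
from scipy.integrate import solve_ivp

cases = {
    "x*f(x) < 0 near 0 (asymptotically stable), f(x) = -x^3": lambda t, x: -x**3,
    "x*f(x) > 0 near 0 (unstable), f(x) = x": lambda t, x: x,
}

for name, f in cases.items():
    sol = solve_ivp(f, (0.0, 10.0), [0.5])   # start close to the origin
    print(f"{name}:  |x(10)| = {abs(sol.y[0, -1]):.4f}")
```

The first trajectory decays toward the origin while the second grows, matching the criterion above.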
Definition 3.2
Let the origin be an asymptotically stable equilibrium point of
the system $\dot{x} = f(x)$, where $f$ is a locally Lipschitz function
defined over a domain $D \subset \mathbb{R}^n$ ($0 \in D$)
The region of attraction (also called region of asymptotic
stability, domain of attraction, or basin) is the set of all
points $x_0$ in $D$ such that the solution of

$\dot{x} = f(x), \qquad x(0) = x_0$

is defined for all $t \ge 0$ and converges to the origin as $t$
tends to infinity
The origin is globally asymptotically stable if the region of
attraction is the whole space $\mathbb{R}^n$
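A rough numerical way (an addition, not from the slides) to estimate a region of attraction is to simulate from a grid of initial states and record which ones converge to the origin. The sketch below uses the illustrative scalar system $\dot{x} = -x + x^3$, whose region of attraction of the origin is $(-1, 1)$.

```python
# Estimate the region of attraction of x' = -x + x^3 by forward simulation.
import numpy as np

def f(x):
    return -x + x**3

dt, T = 1e-3, 20.0
attracted = []
for x0 in np.linspace(-1.5, 1.5, 61):
    x = x0
    for _ in range(int(T / dt)):      # simple forward-Euler integration
        x += dt * f(x)
        if abs(x) > 1e3:              # clearly diverging: stop early
            break
    if abs(x) < 1e-3:                 # settled at the origin
        attracted.append(x0)

print(f"estimated region of attraction: [{min(attracted):.2f}, {max(attracted):.2f}]")
```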
Example: Tunnel Diode Circuit
[Figure: phase portrait of the tunnel diode circuit with equilibrium points $Q_1$, $Q_2$, and $Q_3$]
Linear Time-Invariant Systems
$\dot{x} = Ax$

$x(t) = \exp(At)\, x(0)$

$P^{-1} A P = J = \mathrm{block\ diag}[J_1, J_2, \ldots, J_r]$

$J_i = \begin{bmatrix} \lambda_i & 1 & 0 & \cdots & 0 \\ 0 & \lambda_i & 1 & \ddots & \vdots \\ \vdots & & \ddots & \ddots & 0 \\ \vdots & & & \ddots & 1 \\ 0 & \cdots & \cdots & 0 & \lambda_i \end{bmatrix}_{m \times m}$

$\exp(At) = P \exp(Jt) P^{-1} = \sum_{i=1}^{r} \sum_{k=1}^{m_i} t^{k-1} \exp(\lambda_i t)\, R_{ik}$

$m_i$ is the order of the Jordan block $J_i$

$\mathrm{Re}[\lambda_i] < 0 \;\forall\, i \;\Rightarrow\;$ Asymptotically Stable
$\mathrm{Re}[\lambda_i] > 0$ for some $i \;\Rightarrow\;$ Unstable
$\mathrm{Re}[\lambda_i] \le 0 \;\forall\, i$ & $m_i > 1$ for $\mathrm{Re}[\lambda_i] = 0 \;\Rightarrow\;$ Unstable
$\mathrm{Re}[\lambda_i] \le 0 \;\forall\, i$ & $m_i = 1$ for $\mathrm{Re}[\lambda_i] = 0 \;\Rightarrow\;$ Stable

If an $n \times n$ matrix $A$ has a repeated eigenvalue $\lambda_i$ of algebraic
multiplicity $q_i$, then the Jordan blocks of $\lambda_i$ have order one if
and only if $\mathrm{rank}(A - \lambda_i I) = n - q_i$
Theorem 3.1
The equilibrium point $x = 0$ of $\dot{x} = Ax$ is stable if and only if
all eigenvalues of $A$ satisfy $\mathrm{Re}[\lambda_i] \le 0$ and for every eigenvalue
with $\mathrm{Re}[\lambda_i] = 0$ and algebraic multiplicity $q_i \ge 2$,
$\mathrm{rank}(A - \lambda_i I) = n - q_i$, where $n$ is the dimension of $x$. The
equilibrium point $x = 0$ is globally asymptotically stable if and
only if all eigenvalues of $A$ satisfy $\mathrm{Re}[\lambda_i] < 0$
When all eigenvalues of $A$ satisfy $\mathrm{Re}[\lambda_i] < 0$, $A$ is called a
Hurwitz matrix
When the origin of a linear system is asymptotically stable, its
solution satisfies the inequality
$\|x(t)\| \le k \|x(0)\|\, e^{-\lambda t}, \quad \forall\, t \ge 0, \; k \ge 1, \; \lambda > 0$
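A minimal numerical sketch (added) of the eigenvalue and rank tests of Theorem 3.1; the test matrices are illustrative.

```python
# Classify the origin of x' = Ax from the eigenvalues of A (rough numerical check).
import numpy as np

def classify(A):
    lam = np.linalg.eigvals(A)
    n = A.shape[0]
    if np.all(lam.real < 0):
        return "Hurwitz: (globally) asymptotically stable"
    if np.any(lam.real > 0):
        return "unstable"
    # Re[lambda_i] <= 0 with eigenvalues on the imaginary axis: each such
    # eigenvalue of multiplicity q must satisfy rank(A - lambda*I) = n - q.
    for lam_0 in set(np.round(lam[np.isclose(lam.real, 0.0)], 8)):
        q = int(np.sum(np.isclose(lam, lam_0)))
        if np.linalg.matrix_rank(A - lam_0 * np.eye(n)) != n - q:
            return "unstable (Jordan block of order > 1 on the imaginary axis)"
    return "stable (but not asymptotically stable)"

print(classify(np.array([[0.0, 1.0], [-1.0, -1.0]])))  # Hurwitz
print(classify(np.array([[0.0, 1.0], [0.0,  0.0]])))   # double integrator: unstable
print(classify(np.zeros((2, 2))))                      # lambda = 0, q = 2, rank 0 = n - q: stable
```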
Exponential Stability
Definition 3.3
The equilibrium point $x = 0$ of $\dot{x} = f(x)$ is exponentially
stable if

$\|x(t)\| \le k \|x(0)\|\, e^{-\lambda t}, \quad \forall\, t \ge 0$

$k \ge 1$, $\lambda > 0$, for all $\|x(0)\| < c$

It is globally exponentially stable if the inequality is satisfied
for any initial state $x(0)$
Exponential Stability $\Rightarrow$ Asymptotic Stability
Example 3.2
$\dot{x} = -x^3$
The origin is asymptotically stable
$x(t) = \dfrac{x(0)}{\sqrt{1 + 2t\,x^2(0)}}$

$x(t)$ does not satisfy $|x(t)| \le k e^{-\lambda t} |x(0)|$ because

$|x(t)| \le k e^{-\lambda t} |x(0)| \;\Rightarrow\; \dfrac{e^{2\lambda t}}{1 + 2t\,x^2(0)} \le k^2$

Impossible because $\lim_{t \to \infty} \dfrac{e^{2\lambda t}}{1 + 2t\,x^2(0)} = \infty$
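A short numerical check (added) of the argument above: the ratio $|x(t)|/e^{-\lambda t}$ grows without bound, so no pair $(k, \lambda)$ can give an exponential bound.

```python
# x(t) = x(0)/sqrt(1 + 2 t x(0)^2) decays slower than any exponential.
import numpy as np

x0, lam = 1.0, 0.1        # illustrative values
for t in [1.0, 10.0, 100.0, 1000.0]:
    xt = x0 / np.sqrt(1.0 + 2.0 * t * x0**2)
    ratio = xt / np.exp(-lam * t)
    print(f"t = {t:7.1f}   |x(t)| = {xt:.4e}   |x(t)|/exp(-lam*t) = {ratio:.4e}")
```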
Linearization
$\dot{x} = f(x), \qquad f(0) = 0$

$f$ is continuously differentiable over $D = \{\|x\| < r\}$

$J(x) = \dfrac{\partial f}{\partial x}(x)$

$h(\sigma) = f(\sigma x)$ for $0 \le \sigma \le 1, \qquad h'(\sigma) = J(\sigma x)\, x$

$h(1) - h(0) = \displaystyle\int_0^1 h'(\sigma)\, d\sigma, \qquad h(0) = f(0) = 0$

$f(x) = \left[\displaystyle\int_0^1 J(\sigma x)\, d\sigma\right] x$

Set $A = J(0)$ and add and subtract $Ax$:

$f(x) = [A + G(x)]\, x, \quad \text{where } G(x) = \displaystyle\int_0^1 [J(\sigma x) - J(0)]\, d\sigma$

$G(x) \to 0$ as $x \to 0$

This suggests that in a small neighborhood of the origin we
can approximate the nonlinear system $\dot{x} = f(x)$ by its
linearization about the origin $\dot{x} = Ax$
Theorem 3.2
The origin is exponentially stable if and only if $\mathrm{Re}[\lambda_i] < 0$
for all eigenvalues of $A$

The origin is unstable if $\mathrm{Re}[\lambda_i] > 0$ for some $i$

Linearization fails when $\mathrm{Re}[\lambda_i] \le 0$ for all $i$, with $\mathrm{Re}[\lambda_i] = 0$
for some $i$
Example 3.3
$\dot{x} = a x^3$

$A = \left.\dfrac{\partial f}{\partial x}\right|_{x=0} = \left. 3 a x^2 \right|_{x=0} = 0$
Stable if a = 0; Asymp stable if a < 0; Unstable if a > 0
When a < 0, the origin is not exponentially stable
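As a numerical illustration of Theorem 3.2 (added), linearize the pendulum with friction (used later in this lecture) about the origin and inspect the eigenvalues; the values $a = 1$, $b = 0.5$ are assumptions for the example.

```python
# Linearization test: eigenvalues of the Jacobian at the origin.
import numpy as np

a, b = 1.0, 0.5
# f(x) = (x2, -a*sin(x1) - b*x2); Jacobian evaluated at x = 0:
A = np.array([[0.0,              1.0],
              [-a * np.cos(0.0), -b]])
print(np.linalg.eigvals(A))   # both real parts < 0  =>  exponentially stable origin

# For x' = a*x^3 (Example 3.3) the Jacobian at the origin is 3*a*x^2 |_{x=0} = 0,
# so the linearization has Re[lambda] = 0 and is inconclusive.
```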
Lyapunov's Method

Let $V(x)$ be a continuously differentiable function defined in a
domain $D \subset \mathbb{R}^n$; $0 \in D$. The derivative of $V$ along the
trajectories of $\dot{x} = f(x)$ is
$\dot{V}(x) = \sum_{i=1}^{n} \dfrac{\partial V}{\partial x_i}\, \dot{x}_i = \sum_{i=1}^{n} \dfrac{\partial V}{\partial x_i}\, f_i(x)$

$\phantom{\dot{V}(x)} = \left[\dfrac{\partial V}{\partial x_1},\; \dfrac{\partial V}{\partial x_2},\; \ldots,\; \dfrac{\partial V}{\partial x_n}\right] \begin{bmatrix} f_1(x) \\ f_2(x) \\ \vdots \\ f_n(x) \end{bmatrix} = \dfrac{\partial V}{\partial x}\, f(x)$
If $\phi(t; x)$ is the solution of $\dot{x} = f(x)$ that starts at initial state
$x$ at time $t = 0$, then

$\dot{V}(x) = \left.\dfrac{d}{dt} V(\phi(t; x))\right|_{t=0}$

If $\dot{V}(x)$ is negative, $V$ will decrease along the solution of
$\dot{x} = f(x)$

If $\dot{V}(x)$ is positive, $V$ will increase along the solution of
$\dot{x} = f(x)$
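A small symbolic sketch (added) of computing $\dot{V}(x) = \frac{\partial V}{\partial x} f(x)$ with sympy; the vector field and candidate $V$ are illustrative, not taken from the slides.

```python
# Compute Vdot = dV/dx * f(x) symbolically.
import sympy as sp

x1, x2 = sp.symbols("x1 x2", real=True)
f = sp.Matrix([-x1 + x2, -x1 - x2**3])    # illustrative vector field
V = (x1**2 + x2**2) / 2                   # candidate Lyapunov function

Vdot = sp.simplify((sp.Matrix([V]).jacobian([x1, x2]) * f)[0])
print(Vdot)   # -> -x1**2 - x2**4  (negative definite)
```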
Lyapunov's Theorem (3.3)

If there is $V(x)$ such that

$V(0) = 0$ and $V(x) > 0, \;\forall\, x \in D$ with $x \ne 0$

$\dot{V}(x) \le 0, \;\forall\, x \in D$

then the origin is stable

Moreover, if

$\dot{V}(x) < 0, \;\forall\, x \in D$ with $x \ne 0$

then the origin is asymptotically stable
Furthermore, if $V(x) > 0, \;\forall\, x \ne 0$,

$\|x\| \to \infty \;\Rightarrow\; V(x) \to \infty$

and $\dot{V}(x) < 0, \;\forall\, x \ne 0$, then the origin is globally
asymptotically stable
Proof
Given $\varepsilon > 0$, choose $0 < r \le \varepsilon$ such that $B_r = \{\|x\| \le r\} \subset D$

$\alpha = \min_{\|x\| = r} V(x) > 0$

$0 < \beta < \alpha$

$\Omega_\beta = \{x \in B_r \mid V(x) \le \beta\}$

$\|x\| \le \delta \;\Rightarrow\; V(x) < \beta$

[Figure: the nested sets $B_\delta \subset \Omega_\beta \subset B_r \subset D$]

Solutions starting in $\Omega_\beta$ stay in $\Omega_\beta$ because $\dot{V}(x) \le 0$ in $\Omega_\beta$

$x(0) \in B_\delta \;\Rightarrow\; x(0) \in \Omega_\beta \;\Rightarrow\; x(t) \in \Omega_\beta \;\Rightarrow\; x(t) \in B_r$

$\|x(0)\| < \delta \;\Rightarrow\; \|x(t)\| < r \le \varepsilon, \;\forall\, t \ge 0$
The origin is stable
Now suppose $\dot{V}(x) < 0 \;\forall\, x \in D$, $x \ne 0$. $V(x(t))$ is
monotonically decreasing and $V(x(t)) \ge 0$

$\lim_{t \to \infty} V(x(t)) = c \ge 0$. Show that $c = 0$

Suppose $c > 0$. By continuity of $V(x)$, there is $d > 0$ such
that $B_d \subset \Omega_c$. Then, $x(t)$ lies outside $B_d$ for all $t \ge 0$
$-\gamma = \max_{d \le \|x\| \le r} \dot{V}(x)$ (this maximum is negative, so $\gamma > 0$)

$V(x(t)) = V(x(0)) + \displaystyle\int_0^t \dot{V}(x(\tau))\, d\tau \le V(x(0)) - \gamma t$
This inequality contradicts the assumption c > 0
The origin is asymptotically stable
The condition $\|x\| \to \infty \Rightarrow V(x) \to \infty$ implies that the set
$\Omega_c = \{x \in \mathbb{R}^n \mid V(x) \le c\}$ is compact for every $c > 0$. This is
so because for any $c > 0$, there is $r > 0$ such that $V(x) > c$
whenever $\|x\| > r$. Thus, $\Omega_c \subset B_r$. All solutions starting in $\Omega_c$
will converge to the origin. For any point $p \in \mathbb{R}^n$, choosing
$c = V(p)$ ensures that $p \in \Omega_c$

The origin is globally asymptotically stable
Terminology
$V(0) = 0,\; V(x) \ge 0$ for $x \ne 0$: Positive semidefinite
$V(0) = 0,\; V(x) > 0$ for $x \ne 0$: Positive definite
$V(0) = 0,\; V(x) \le 0$ for $x \ne 0$: Negative semidefinite
$V(0) = 0,\; V(x) < 0$ for $x \ne 0$: Negative definite
$\|x\| \to \infty \;\Rightarrow\; V(x) \to \infty$: Radially unbounded
Lyapunov Theorem

The origin is stable if there is a continuously differentiable
positive definite function $V(x)$ so that $\dot{V}(x)$ is negative
semidefinite, and it is asymptotically stable if $\dot{V}(x)$ is negative
definite. It is globally asymptotically stable if the conditions
for asymptotic stability hold globally and $V(x)$ is radially
unbounded
A continuously differentiable function V (x) satisfying the
conditions for stability is called a Lyapunov function. The
surface V (x) = c, for some c > 0, is called a Lyapunov surface
or a level surface
[Figure: nested Lyapunov surfaces $V(x) = c_1$, $c_2$, $c_3$ with $c_1 < c_2 < c_3$]
Why do we need the radial unboundedness condition to show
global asymptotic stability?
It ensures that $\Omega_c = \{V(x) \le c\}$ is bounded for every $c > 0$.
Without it, $\Omega_c$ might not be bounded for large $c$
Example
$V(x) = \dfrac{x_1^2}{1 + x_1^2} + x_2^2$

[Figure: level sets of $V(x)$; the sets $\{V(x) \le c\}$ are unbounded for $c \ge 1$]
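A quick numerical confirmation (added) that the level sets of this $V$ are unbounded once $c \ge 1$: points with $V(x) = c$ exist at arbitrarily large $\|x\|$.

```python
# Points on the level set V(x) = c for c = 1.5 at increasingly large ||x||.
import numpy as np

c = 1.5
for x1 in [1.0, 10.0, 100.0, 1000.0]:
    x2 = np.sqrt(c - x1**2 / (1.0 + x1**2))     # solve V(x1, x2) = c for x2 >= 0
    V = x1**2 / (1.0 + x1**2) + x2**2
    print(f"x1 = {x1:7.1f}   x2 = {x2:.4f}   ||x|| = {np.hypot(x1, x2):9.1f}   V = {V:.6f}")
```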
Example: Pendulum equation without friction
$\dot{x}_1 = x_2, \qquad \dot{x}_2 = -a \sin x_1$

$V(x) = a(1 - \cos x_1) + \tfrac{1}{2} x_2^2$

$V(0) = 0$ and $V(x)$ is positive definite over the domain
$-2\pi < x_1 < 2\pi$

$\dot{V}(x) = a \dot{x}_1 \sin x_1 + x_2 \dot{x}_2 = a x_2 \sin x_1 - a x_2 \sin x_1 = 0$

The origin is stable

Since $\dot{V}(x) \equiv 0$, the origin is not asymptotically stable
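A symbolic check (added) that $\dot{V} \equiv 0$ for this choice of $V$ along the frictionless pendulum dynamics.

```python
# Verify Vdot = 0 for V = a*(1 - cos x1) + x2^2/2 along x1' = x2, x2' = -a*sin(x1).
import sympy as sp

x1, x2 = sp.symbols("x1 x2", real=True)
a = sp.symbols("a", positive=True)
f = sp.Matrix([x2, -a * sp.sin(x1)])
V = a * (1 - sp.cos(x1)) + x2**2 / 2

Vdot = sp.simplify((sp.Matrix([V]).jacobian([x1, x2]) * f)[0])
print(Vdot)   # -> 0: V is constant along trajectories, so the origin is stable
```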
Example: Pendulum equation with friction
$\dot{x}_1 = x_2, \qquad \dot{x}_2 = -a \sin x_1 - b x_2$

$V(x) = a(1 - \cos x_1) + \tfrac{1}{2} x_2^2$

$\dot{V}(x) = a \dot{x}_1 \sin x_1 + x_2 \dot{x}_2 = -b x_2^2$

The origin is stable

$\dot{V}(x)$ is not negative definite because $\dot{V}(x) = 0$ for $x_2 = 0$
irrespective of the value of $x_1$
The conditions of Lyapunov's theorem are only sufficient.
Failure of a Lyapunov function candidate to satisfy the
conditions for stability or asymptotic stability does not mean
that the equilibrium point is not stable or asymptotically
stable. It only means that such stability property cannot be
established by using this Lyapunov function candidate
Try

$V(x) = \tfrac{1}{2} x^T P x + a(1 - \cos x_1) = \tfrac{1}{2} [x_1 \;\; x_2] \begin{bmatrix} p_{11} & p_{12} \\ p_{12} & p_{22} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} + a(1 - \cos x_1)$

$p_{11} > 0, \qquad p_{11} p_{22} - p_{12}^2 > 0$
$\dot{V}(x) = (p_{11} x_1 + p_{12} x_2 + a \sin x_1)\, x_2 + (p_{12} x_1 + p_{22} x_2)(-a \sin x_1 - b x_2)$
$\phantom{\dot{V}(x)} = a(1 - p_{22})\, x_2 \sin x_1 - a p_{12}\, x_1 \sin x_1 + (p_{11} - p_{12} b)\, x_1 x_2 + (p_{12} - p_{22} b)\, x_2^2$

$p_{22} = 1, \qquad p_{11} = b p_{12} \;\Rightarrow\; 0 < p_{12} < b, \qquad \text{Take } p_{12} = b/2$

$\dot{V}(x) = -\tfrac{1}{2} a b\, x_1 \sin x_1 - \tfrac{1}{2} b\, x_2^2$

$D = \{|x_1| < \pi\}$

$V(x)$ is positive definite and $\dot{V}(x)$ is negative definite over $D$.
The origin is asymptotically stable
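A symbolic verification (added) of the computation above: with $p_{22} = 1$, $p_{11} = b p_{12}$ and $p_{12} = b/2$, $\dot{V}$ reduces to $-\tfrac{1}{2} a b\, x_1 \sin x_1 - \tfrac{1}{2} b\, x_2^2$.

```python
# Verify Vdot for V = 0.5*x'Px + a*(1 - cos x1) with p22 = 1, p12 = b/2, p11 = b*p12.
import sympy as sp

x1, x2 = sp.symbols("x1 x2", real=True)
a, b = sp.symbols("a b", positive=True)
p12 = b / 2
p11, p22 = b * p12, 1

f = sp.Matrix([x2, -a * sp.sin(x1) - b * x2])
V = sp.Rational(1, 2) * (p11 * x1**2 + 2 * p12 * x1 * x2 + p22 * x2**2) + a * (1 - sp.cos(x1))

Vdot = sp.expand((sp.Matrix([V]).jacobian([x1, x2]) * f)[0])
print(sp.simplify(Vdot + a * b / 2 * x1 * sp.sin(x1) + b / 2 * x2**2))   # -> 0
```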
Variable Gradient Method
$\dot{V}(x) = \dfrac{\partial V}{\partial x}\, f(x) = g^T(x)\, f(x)$

$g(x) = \nabla V = \left(\dfrac{\partial V}{\partial x}\right)^T$

Choose $g(x)$ as the gradient of a positive definite function
$V(x)$ that would make $\dot{V}(x)$ negative definite

$g(x)$ is the gradient of a scalar function if and only if

$\dfrac{\partial g_i}{\partial x_j} = \dfrac{\partial g_j}{\partial x_i}, \quad \forall\, i, j = 1, \ldots, n$

Choose $g(x)$ such that $g^T(x) f(x)$ is negative definite
Compute the integral
$V(x) = \displaystyle\int_0^x g^T(y)\, dy = \displaystyle\int_0^x \sum_{i=1}^{n} g_i(y)\, dy_i$

over any path joining the origin to $x$; for example

$V(x) = \displaystyle\int_0^{x_1} g_1(y_1, 0, \ldots, 0)\, dy_1 + \displaystyle\int_0^{x_2} g_2(x_1, y_2, 0, \ldots, 0)\, dy_2 + \cdots + \displaystyle\int_0^{x_n} g_n(x_1, x_2, \ldots, x_{n-1}, y_n)\, dy_n$
Leave some parameters of g(x) undetermined and choose
them to make V (x) positive definite
Example 3.7 (Read)
$\dot{x}_1 = x_2, \qquad \dot{x}_2 = -h(x_1) - a x_2$

$a > 0$, $h(\cdot)$ is locally Lipschitz, $h(0) = 0$;
$y\, h(y) > 0 \;\forall\, y \ne 0,\; y \in (-b, c),\; b > 0,\; c > 0$

$\dfrac{\partial g_1}{\partial x_2} = \dfrac{\partial g_2}{\partial x_1}$

$\dot{V}(x) = g_1(x)\, x_2 - g_2(x)\,[h(x_1) + a x_2] < 0, \quad \text{for } x \ne 0$

$V(x) = \displaystyle\int_0^x g^T(y)\, dy > 0, \quad \text{for } x \ne 0$
Try

$g(x) = \begin{bmatrix} \phi_1(x_1) + \psi_1(x_2) \\ \phi_2(x_1) + \psi_2(x_2) \end{bmatrix}$

To satisfy the symmetry requirement, we must have

$\dfrac{\partial \psi_1}{\partial x_2} = \dfrac{\partial \phi_2}{\partial x_1} \;\Rightarrow\; \psi_1(x_2) = \gamma x_2 \;\text{ and }\; \phi_2(x_1) = \gamma x_1$

$\dot{V}(x) = -\gamma x_1 h(x_1) - a x_2 \psi_2(x_2) + \gamma x_2^2 + x_2 \phi_1(x_1) - a \gamma x_1 x_2 - \psi_2(x_2) h(x_1)$
To cancel the cross-product terms, take

$\psi_2(x_2) = \delta x_2 \;\text{ and }\; \phi_1(x_1) = a \gamma x_1 + \delta h(x_1)$

$g(x) = \begin{bmatrix} a \gamma x_1 + \delta h(x_1) + \gamma x_2 \\ \gamma x_1 + \delta x_2 \end{bmatrix}$

$V(x) = \displaystyle\int_0^{x_1} [a \gamma y_1 + \delta h(y_1)]\, dy_1 + \displaystyle\int_0^{x_2} (\gamma x_1 + \delta y_2)\, dy_2$
$\phantom{V(x)} = \tfrac{1}{2} a \gamma x_1^2 + \delta \displaystyle\int_0^{x_1} h(y)\, dy + \gamma x_1 x_2 + \tfrac{1}{2} \delta x_2^2$
$\phantom{V(x)} = \tfrac{1}{2} x^T P x + \delta \displaystyle\int_0^{x_1} h(y)\, dy, \qquad P = \begin{bmatrix} a\gamma & \gamma \\ \gamma & \delta \end{bmatrix}$
$V(x) = \tfrac{1}{2} x^T P x + \delta \displaystyle\int_0^{x_1} h(y)\, dy, \qquad P = \begin{bmatrix} a\gamma & \gamma \\ \gamma & \delta \end{bmatrix}$

$\dot{V}(x) = -\gamma x_1 h(x_1) - (a\delta - \gamma)\, x_2^2$

Choose $\delta > 0$ and $0 < \gamma < a\delta$

If $y\, h(y) > 0$ holds for all $y \ne 0$, the conditions of Lyapunov's
theorem hold globally and $V(x)$ is radially unbounded
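A symbolic check (added) of this variable-gradient construction for the illustrative choice $h(y) = y^3$ (any $h$ with $y\,h(y) > 0$ would do); the symbols $\gamma$, $\delta$ are the undetermined parameters above.

```python
# Verify V and Vdot for the variable-gradient example with h(y) = y^3.
import sympy as sp

x1, x2, y = sp.symbols("x1 x2 y", real=True)
a, gam, delta = sp.symbols("a gamma delta", positive=True)
h = lambda z: z**3                       # illustrative nonlinearity

f = sp.Matrix([x2, -h(x1) - a * x2])
g = sp.Matrix([a * gam * x1 + delta * h(x1) + gam * x2,
               gam * x1 + delta * x2])

# V by integrating g along the axes, then Vdot = g^T f
V = (sp.integrate(g[0].subs({x1: y, x2: 0}), (y, 0, x1))
     + sp.integrate(g[1].subs(x2, y), (y, 0, x2)))
Vdot = sp.expand((g.T * f)[0])

print(sp.simplify(V - (sp.Rational(1, 2) * a * gam * x1**2 + gam * x1 * x2
                       + sp.Rational(1, 2) * delta * x2**2 + delta * x1**4 / 4)))  # -> 0
print(sp.simplify(Vdot + gam * x1 * h(x1) + (a * delta - gam) * x2**2))            # -> 0
```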