The stability of an orbit of a dynamical system characterizes whether nearby (i.e., perturbed) orbits will remain in a neighborhood of that orbit or be repelled away from it. Asymptotic stability additionally characterizes attraction of nearby orbits to this orbit in the long-time limit. The distinct concept of structural stability is treated elsewhere, and concerns changes in the family of all solutions due to perturbations to the functions defining the dynamical system.
We mainly consider autonomous ordinary differential equations (ODEs), written in vector notation as:
\[
\dot{x} = f(x), \qquad x \in \mathbb{R}^n, \tag{1}
\]
where \(\dot{x}\) denotes the derivative of \(x\) with respect to time and \(f: \mathbb{R}^n \to \mathbb{R}^n\) is a smooth vector field.
We denote a solution to (1) by \(x(t)\), with initial condition \(x(0) = x_0\).
Equilibria (sometimes called equilibrium points or
fixed points) are special constant solutions \(x(t) \equiv x_e\) where \(f(x_e) = 0\), which is equivalent to requiring \(\dot{x}(t) = 0\) for all \(t\).
Below, we treat the stability of equilibria in detail, and then
mention extensions to the stability of more general solutions
\(x(t)\). We also give some analogous results
for maps.
Definitions: Stability of an Equilibrium
Lyapunov stability
The equilibrium \(x_e\) is a stable equilibrium if for every
neighborhood \(U\) of \(x_e\) there is a neighborhood
\(V\) of \(x_e\) such that every solution \(x(t)\)
starting in \(V\) (i.e., \(x(0) \in V\)) remains in \(U\) for all \(t \geq 0\). Notice that \(x(t)\) need not approach \(x_e\).
If \(x_e\) is not stable, it is unstable.
Figure 2: Asymptotic stability: a sink (see below).
Asymptotic stability
An equilibrium \(x_e\) is asymptotically stable if it is Lyapunov stable and additionally \(V\) can be chosen so that \(x(t) \to x_e\) as \(t \to \infty\) for all \(x(0) \in V\).
An equilibrium that is Lyapunov stable but not asymptotically stable is sometimes called neutrally stable. See Figure 1 and Figure 2 for
illustrations.
Exponential stability
An equilibrium \(x_e\) is exponentially stable if there is a neighborhood \(V\) of \(x_e\) and a constant \(\alpha > 0\) such that
\(|x(t) - x_e| = O(e^{-\alpha t})\) as \(t \to \infty\) for all \(x(0) \in V\). Exponentially stable equilibria are also asymptotically stable, and hence Lyapunov stable.
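As a minimal numerical sketch of these definitions, one can sample initial conditions in a small ball around an equilibrium, integrate, and record both the largest excursion (relevant to Lyapunov stability) and the final distance from the equilibrium (relevant to asymptotic stability). The damped linear oscillator below is an assumed example, not one from the text above.

```python
# Minimal sketch (assumed example system): probe the stability definitions
# numerically for a damped linear oscillator with equilibrium x_e = (0, 0).
import numpy as np
from scipy.integrate import solve_ivp

def f(t, z):
    x, y = z
    return [y, -x - 0.2 * y]          # assumed vector field; equilibrium at origin

rng = np.random.default_rng(0)
delta = 0.1                            # radius of the starting neighborhood V
samples = [delta * v / np.linalg.norm(v) for v in rng.normal(size=(20, 2))]

max_excursion, final_dist = 0.0, 0.0
for z0 in samples:
    sol = solve_ivp(f, (0.0, 100.0), z0, max_step=0.05)
    r = np.linalg.norm(sol.y, axis=0)              # distance from the equilibrium
    max_excursion = max(max_excursion, r.max())    # stays in a bounded U?
    final_dist = max(final_dist, r[-1])            # approaches x_e?

print(f"largest |x(t) - x_e| over all runs: {max_excursion:.3f}")
print(f"largest |x(T) - x_e| at T = 100:   {final_dist:.2e}")
```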
Linearization
Suppose that \(x_e\) is an equilibrium, so that if \(x(0) = x_e\), then
\(x(t) \equiv x_e\). Let \(x(t) = x_e + \xi(t)\), where
\(\xi(t)\) is a small
perturbation. Substitute this
into (1) and expand \(f\) in a
multivariable, vector-valued Taylor series to obtain:
\[
\dot{x}_e + \dot{\xi} = f(x_e) + Df(x_e)\,\xi + O(|\xi|^2). \tag{2}
\]
(We assume that \(f\) is sufficiently
differentiable so that Taylor's Theorem with remainder applies to
each component \(f_i\).) Here, \(Df(x_e)\) denotes the \(n \times n\) Jacobian matrix of partial derivatives \(\partial f_i / \partial x_j\), evaluated at the equilibrium \(x_e\),
and \(O(|\xi|^2)\) denotes terms of quadratic and higher
order in the components \(\xi_j\) of \(\xi\).
Specifically, if \(g(\xi) = O(|\xi|^2)\), then there is a constant \(C\) such that \(|g(\xi)| \leq C |\xi|^2\) for all sufficiently small \(|\xi|\).
Thus, for small enough \(|\xi|\), the first-order term \(Df(x_e)\,\xi\)
dominates. Taking into account that
\(\dot{x}_e\) and \(f(x_e)\) vanish and ignoring the
small term \(O(|\xi|^2)\), we get the linear
system:
\[
\dot{\xi} = Df(x_e)\,\xi. \tag{3}
\]
This is called the linearization of (1). It can be
solved by standard methods (Boyce and DiPrima, 1997).
The general solution of Eqn.
(3) is determined by the eigenvalues and
eigenvectors of the Jacobian matrix \(Df(x_e)\).
Here we are concerned with qualitative properties rather than complete
solutions. In particular, in studying stability we want to know
whether the size \(|\xi(t)|\) of solutions grows, stays constant, or shrinks as
\(t \to \infty\). This can usually be answered just by
examining the eigenvalues.
Recall that, if \(\lambda\) is a real eigenvalue with
eigenvector \(v\), then there is a solution to the
linearized equation of the form:
\[
\xi(t) = e^{\lambda t} v.
\]
If \(\lambda = \alpha \pm i\beta\) is a complex conjugate
pair with eigenvectors \(v = u \pm i w\) (where \(u, w\) are
real) then
\[
\xi(t) = e^{\alpha t}\,(u \cos\beta t - w \sin\beta t)
\]
and
\[
\xi(t) = e^{\alpha t}\,(u \sin\beta t + w \cos\beta t)
\]
are two linearly-independent solutions. In both cases the real part of \(\lambda\) (almost) determines stability. Since any solution of the linearized equation can be written as a linear superposition of terms of these forms (except in the case of
multiple eigenvalues), we can deduce that
- If all eigenvalues of \(Df(x_e)\) have strictly negative real parts, then \(\xi(t) \to 0\) as \(t \to \infty\) for all solutions of (3).
- If at least one eigenvalue of \(Df(x_e)\) has a positive real part, then there is a solution with \(|\xi(t)| \to \infty\) as \(t \to \infty\).
- If some pairs of complex-conjugate eigenvalues have zero real parts with distinct imaginary parts, then the corresponding solutions oscillate and neither decay nor grow as \(t \to \infty\).
Note: The eigenvalues of the linearization are preserved under (smooth) changes of coordinates (Arnold, 1973).
Note: When multiple eigenvalues exist and there are not enough
linearly-independent eigenvectors to span \(\mathbb{R}^n\),
solutions behave like \(t^k e^{\lambda t}\) (with \(k \geq 1\)), so that they still decay for sufficiently long times if \(\mathrm{Re}(\lambda) < 0\) and grow if \(\mathrm{Re}(\lambda) > 0\).
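As a sketch of this note (using an assumed \(2 \times 2\) Jordan block, not an example from the text), the matrix below has the repeated eigenvalue \(-1\) and only one eigenvector; the first component of the solution is exactly \(t e^{-t}\), which grows before it decays:

```python
# Sketch for the note above (assumed example): a defective matrix with a
# repeated eigenvalue -1 and a single eigenvector gives solutions ~ t * exp(-t).
import numpy as np
from scipy.linalg import expm

A = np.array([[-1.0, 1.0],
              [ 0.0, -1.0]])          # Jordan block, eigenvalue -1 (twice)
xi0 = np.array([0.0, 1.0])

for t in [0.0, 0.5, 1.0, 2.0, 5.0, 10.0]:
    xi = expm(A * t) @ xi0
    # First component equals t * exp(-t): transient growth, then eventual decay.
    print(f"t={t:5.1f}  xi_1={xi[0]: .4f}  t*exp(-t)={t * np.exp(-t): .4f}")
```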
Note: The form \(t^k e^{\lambda t}\) implies that transient growth can occur over initial times even if \(\mathrm{Re}(\lambda) < 0\). This can also occur in the case of distinct eigenvalues. See (Trefethen and Embree, 2005) for more on this, but consider a two-dimensional example with two strictly negative eigenvalues and a large off-diagonal coupling. Starting from an initial condition whose first coordinate is zero, that coordinate grows from zero to a maximum value proportional to the coupling before decaying. For sufficiently large coupling, this initial growth of the first coordinate overwhelms the decay of the second, so that the trajectory transiently moves farther from the fixed point before approaching it as \(t \to \infty\). This also illustrates the need for the two neighborhoods \(U\) and \(V\) in the definitions of stability.
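The sketch below uses an assumed non-normal matrix with distinct eigenvalues \(-1\) and \(-2\) and a large off-diagonal coupling (an illustration of the phenomenon, not necessarily the article's original example); the norm of the solution first grows to roughly one quarter of the coupling strength before decaying:

```python
# Assumed illustration of transient growth with distinct, strictly negative
# eigenvalues: a non-normal matrix whose large off-diagonal coupling pushes
# the trajectory away from 0 before it decays. Initial condition xi(0) = (0, 1).
import numpy as np
from scipy.linalg import expm

coupling = 50.0
A = np.array([[-1.0, coupling],
              [ 0.0, -2.0]])
print(np.linalg.eigvals(A))            # [-1, -2]: both modes decay eventually

xi0 = np.array([0.0, 1.0])
ts = np.linspace(0.0, 10.0, 2001)
norms = [np.linalg.norm(expm(A * t) @ xi0) for t in ts]
k = int(np.argmax(norms))
# The exact first component is coupling*(exp(-t) - exp(-2t)), peaking at
# coupling/4 when t = ln 2, before the whole solution tends to zero.
print(f"peak |xi| = {norms[k]:.2f} at t = {ts[k]:.2f}, |xi(10)| = {norms[-1]:.2e}")
```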
This motivates the concept of:
Hyperbolic equilibria
Definition: \(x_e\) is a hyperbolic
or non-degenerate equilibrium if all the eigenvalues of
\(Df(x_e)\) have non-zero real parts.
Equipped with the linear analysis sketched above, and recognizing
that the remainder terms ignored in passing from Eqn.
(2) to (3) can be made as
small as we wish by selecting a sufficiently small neighborhood of
\(x_e\), we can determine the stability of
hyperbolic equilibria from their linearization:
Proposition: If \(x_e\) is an equilibrium of (1) and all the eigenvalues of the Jacobian matrix \(Df(x_e)\) have strictly negative real parts, then \(x_e\) is exponentially (and hence asymptotically) stable. If at least one eigenvalue has strictly positive real part, then \(x_e\) is unstable.
Moreover, the Hartman-Grobman Theorem says that the full nonlinear system (1) is topologically equivalent to the linearized system (3) in a small neighborhood of a hyperbolic equilibrium.
Borrowing from fluid mechanics, we say that if all nearby solutions
approach an equilibrium (e.g. all eigenvalues have negative real
parts), it is a sink; if all nearby solutions recede from it, it
is a source, and if some approach and some recede, it is a
saddle point. When the equilibrium is surrounded by nested
closed orbits, we call it a center.
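As a minimal computational sketch of this classification (the damped, pendulum-like vector field below is an assumed example, not one from the text), one can approximate the Jacobian at an equilibrium by central finite differences and read off the signs of the real parts of its eigenvalues:

```python
# Sketch of the linearized stability test for an assumed example vector field:
# approximate Df(x_e) by central finite differences and classify the equilibrium.
import numpy as np

def f(z):
    x, y = z
    return np.array([y, -np.sin(x) - 0.3 * y])   # assumed damped-pendulum-like field

def jacobian(f, z, h=1e-6):
    n = len(z)
    J = np.zeros((n, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = h
        J[:, j] = (f(z + e) - f(z - e)) / (2 * h)
    return J

for xe in [np.array([0.0, 0.0]), np.array([np.pi, 0.0])]:   # equilibria: f(xe) = 0
    lam = np.linalg.eigvals(jacobian(f, xe))
    re = lam.real
    if np.all(re < 0):
        kind = "sink (asymptotically stable)"
    elif np.all(re > 0):
        kind = "source (unstable)"
    elif np.any(re > 0) and np.any(re < 0):
        kind = "saddle (unstable)"
    else:
        kind = "non-hyperbolic: linearization inconclusive"
    print(xe, lam.round(3), "->", kind)
```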
Degenerate Equilibria
One might hope that Lyapunov stability (per the definition above) still follows from the linearization even when some eigenvalues have zero real part (and none have positive real part), but the following counterexamples demonstrate that this is not the case:
Example 1
Consider
\[
\dot{x} = a x^3. \tag{4}
\]
Here \(x_e = 0\) is the equilibrium and the linearization at 0 is
\[
\dot{\xi} = 0, \tag{5}
\]
with solution \(\xi(t) = \xi(0)\), so \(x_e = 0\)
certainly is Lyapunov stable for Eqn. (5), but not asymptotically stable.
The exact solution of the nonlinear ODE (4) may be found by separating variables:
\[
x(t) = \frac{x_0}{\sqrt{1 - 2 a x_0^2\, t}}.
\]
We therefore deduce that \(x_e = 0\) is unstable for (4) if \(a > 0\) (solutions with \(x_0 \neq 0\) blow up in finite time), asymptotically stable if \(a < 0\), and Lyapunov (but not asymptotically) stable if \(a = 0\).
The linearized system (5) is degenerate and the
nonlinear "remainder terms", ignored in our linearized analysis,
determine the outcome in this case. Here it is obvious, at least in
retrospect, that ignoring these terms is perilous, since while they
are indeed \(O(|x|^2)\) (in fact,
\(O(|x|^3)\)), the linear
term is identically zero!
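A quick numerical check of this example (a sketch; the initial condition and the values of \(a\) are arbitrary choices): integrate \(\dot{x} = a x^3\) and compare with the separated-variables solution above.

```python
# Numerical check of Example 1: integrate x' = a*x**3 and compare with the
# exact solution x(t) = x0 / sqrt(1 - 2*a*x0**2*t).
import numpy as np
from scipy.integrate import solve_ivp

def exact(t, x0, a):
    return x0 / np.sqrt(1.0 - 2.0 * a * x0**2 * t)

x0 = 0.5
for a in (-1.0, 1.0):
    T = 5.0 if a < 0 else 1.9    # for a > 0, blow-up occurs at t = 1/(2*a*x0**2) = 2
    sol = solve_ivp(lambda t, x: a * x**3, (0.0, T), [x0], rtol=1e-9, atol=1e-12)
    print(f"a={a:+.0f}: x({T}) numeric {sol.y[0, -1]: .4f}, exact {exact(T, x0, a): .4f}")
# a = -1: the solution decays toward 0 (asymptotic stability);
# a = +1: it blows up in finite time, so x_e = 0 is unstable.
```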
Example 2
Consider the two-dimensional system
\[
\dot{x} = -y + a x (x^2 + y^2), \qquad \dot{y} = x + a y (x^2 + y^2).
\]
Note that the linearization is simply a harmonic oscillator with
eigenvalues \(\pm i\). Is the equilibrium \((x, y) = (0, 0)\) of this system stable or unstable? To answer this, it
is convenient to transform to polar coordinates \(x = r\cos\theta,\ y = r\sin\theta\), which gives the uncoupled system:
\[
\dot{r} = a r^3, \qquad \dot{\theta} = 1.
\]
The first equation is as in the example above, so we conclude that the origin is unstable if \(a > 0\), Lyapunov stable if \(a = 0\), and
asymptotically stable if \(a < 0\). Again, the linearization describes stability correctly only if
\(a = 0\).
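A short symbolic check of the polar-coordinate reduction, using the Cartesian form written above (a sketch relying on sympy):

```python
# Sketch: verify the polar reduction r' = a*r**3, theta' = 1 for the Cartesian
# system written above, using r' = (x x' + y y')/r and theta' = (x y' - y x')/r^2.
import sympy as sp

x, y, a = sp.symbols('x y a', real=True)
r2 = x**2 + y**2
xdot = -y + a * x * r2
ydot =  x + a * y * r2

rdot = sp.simplify((x * xdot + y * ydot) / sp.sqrt(r2))
thetadot = sp.simplify((x * ydot - y * xdot) / r2)

print(rdot)      # should simplify to a*(x**2 + y**2)**(3/2), i.e. a*r**3
print(thetadot)  # should simplify to 1
```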
How can we prove stability in such degenerate cases, in which one or
more eigenvalues has zero real part? One method requires
construction of a function, often called a Lyapunov function,
which remains constant, or decreases, along solutions. For
mechanical systems the total (kinetic plus potential) energy is
often a good candidate. This allows one to prove stability and even
asymptotic stability in certain cases, via Lyapunov's
second method or direct method:
Theorem: Suppose that \(\dot{x} = f(x)\) has an isolated equilibrium at \(x = 0\) (without loss of generality, one can move an equilibrium \(x_e\) to the origin by letting \(x \mapsto x - x_e\)). If there exists a differentiable function \(V(x)\), which is positive definite in a neighborhood \(U\) of \(0\) (in the sense that \(V(0) = 0\) and \(V(x) > 0\) for \(x \in U \setminus \{0\}\)) and for which \(\dot{V}(x) = \nabla V(x) \cdot f(x)\) is negative definite on some domain \(D \ni 0\), then \(x = 0\) is asymptotically stable. If \(\dot{V}\) is negative semidefinite (i.e., \(\dot{V}(x) = 0\) is allowed for \(x \neq 0\)), then \(x = 0\) is Lyapunov-stable.
For a proof, see, e.g., Hirsch, Smale and Devaney (2004).
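As a sketch of the direct method, consider a hypothetical example (not one discussed above): the system \(\dot{x} = -y - x^3,\ \dot{y} = x - y^3\) again has a degenerate linearization with eigenvalues \(\pm i\), yet the candidate function \(V = (x^2 + y^2)/2\) decreases along solutions, so the origin is asymptotically stable. The computation of \(\dot{V} = \nabla V \cdot f\) can be done symbolically:

```python
# Sketch of Lyapunov's direct method for a hypothetical example system:
# f = (-y - x**3, x - y**3), V = (x**2 + y**2)/2.
import sympy as sp

x, y = sp.symbols('x y', real=True)
fx = -y - x**3
fy =  x - y**3

V = (x**2 + y**2) / 2
Vdot = sp.simplify(sp.diff(V, x) * fx + sp.diff(V, y) * fy)   # dV/dt along solutions
print(Vdot)   # -x**4 - y**4: negative definite, so (0, 0) is asymptotically stable
```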
Linearized stability for maps
Analogous results exist for stability of fixed points of maps of the form
\[
x_{n+1} = f(x_n), \qquad x_n \in \mathbb{R}^n,
\]
but here the magnitude, rather than the sign of the real part, of the eigenvalues
is important. A constant solution \(x_n \equiv x_e\) is called a fixed point if \(f(x_e) = x_e\).
Definition: \(x_e\) is a hyperbolic or
non-degenerate fixed point of the map
if no eigenvalue of \(Df(x_e)\)
has magnitude 1.
Proposition: If \(x_e\) is a fixed point of the map and all the eigenvalues of \(Df(x_e)\) have magnitudes strictly less than 1, then \(x_e\) is asymptotically stable. If at least one eigenvalue has magnitude greater than 1, then \(x_e\) is unstable.
Example: Logistic map
The following one-dimensional example illustrates this. Consider the
logistic map
\[
x_{n+1} = \mu\, x_n (1 - x_n),
\]
which has two fixed points, \(x_e = 0\) and \(x_e = 1 - 1/\mu\). The linearization at \(x_e = 0\) is
\[
\xi_{n+1} = \mu\, \xi_n,
\]
which may be solved to give \(\xi_n = \mu^n \xi_0\).
Clearly, \(\xi_n\) grows without bound if \(|\mu| > 1\), and decays to 0 if \(|\mu| < 1\).
Similarly, we linearize at \(x_e = 1 - 1/\mu\) and obtain
\[
\xi_{n+1} = (2 - \mu)\, \xi_n,
\]
so that this fixed point is asymptotically stable if \(|2 - \mu| < 1\), i.e., if \(1 < \mu < 3\).
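This analysis can be checked by direct iteration; in the sketch below the values of \(\mu\) are arbitrary choices on either side of the stability boundary \(\mu = 3\):

```python
# Sketch checking the logistic-map analysis: iterate x_{n+1} = mu*x_n*(1 - x_n)
# and compare with the linearized multiplier 2 - mu at the fixed point 1 - 1/mu.
def logistic_orbit(mu, x0, n):
    x = x0
    for _ in range(n):
        x = mu * x * (1.0 - x)
    return x

for mu in (2.5, 3.3):
    x_star = 1.0 - 1.0 / mu
    multiplier = 2.0 - mu                      # f'(x_star) = mu*(1 - 2*x_star)
    x_final = logistic_orbit(mu, x_star + 0.05, 200)
    print(f"mu={mu}: |2 - mu| = {abs(multiplier):.2f}, "
          f"distance from fixed point after 200 steps = {abs(x_final - x_star):.2e}")
# |2 - mu| < 1 (mu = 2.5): iterates converge to the fixed point.
# |2 - mu| > 1 (mu = 3.3): they do not; the fixed point is unstable.
```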
Stability of General Orbits
Figure 3: Orbital stability.
The notions of stability may be generalized to non-constant orbits
(periodic, quasiperiodic or non-periodic) of ODEs.
First, we give some definitions and notation. Let \(x(t, x_0)\) denote the solution of (1), given the initial value
\(x(0, x_0) = x_0\). Then, the (forward) orbit
\[
\gamma_{x_0} = \{\, x(t, x_0) : t \geq 0 \,\}
\]
is the set of all values that this
trajectory attains. Next, we have

Definition: Two orbits \(\gamma_{x_0}\) and \(\gamma_{y_0}\)
are
\(\epsilon\)-close if there is a reparameterization of
time \(\tilde{t}(t)\) (a smooth, monotonic function) such that
\[
|x(t, x_0) - x(\tilde{t}(t), y_0)| < \epsilon \quad \text{for all } t \geq 0.
\]
We say that an orbit is orbitally stable if all orbits with nearby
initial points remain close in this sense:
Definition: An orbit \(\gamma_{x_0}\) is
orbitally stable if, for any \(\epsilon > 0\), there is a
neighborhood \(V\) of \(x_0\) so that, for all
\(y_0\) in \(V\), \(\gamma_{x_0}\) and \(\gamma_{y_0}\)
are \(\epsilon\)-close.
Definition: If additionally \(V\) may be chosen so
that, for all \(y_0 \in V\), there exists a
constant \(\tau(y_0)\) so that
\[
|x(t, x_0) - x(t - \tau(y_0), y_0)| \to 0 \quad \text{as } t \to \infty,
\]
then \(\gamma_{x_0}\) is asymptotically stable.
See Figure 1-Figure 4, which show (a segment of) the
orbit \(\gamma_{x_0}\) as well as a
neighboring orbit \(\gamma_{y_0}\). The black lines indicate the boundary of an
\(\epsilon\)-neighborhood of \(\gamma_{x_0}\).
Figure 4: Asymptotic orbital stability (an \(\epsilon\)-neighborhood of \(\gamma_{x_0}\) is shown in yellow).
We note that the linearization techniques discussed above for
equilibria and fixed points can be extended to apply to asymptotic
stability of Periodic orbits, as described in the corresponding
article.
Example: The nonlinear pendulum
Consider the pendulum equations
\[
\dot{\theta} = v, \qquad \dot{v} = -\sin\theta.
\]
Orbits lie on the energy level sets shown in
Figure 5. Neighboring orbits have different periods. However, the two orbits animated in the
figure are \(\epsilon\)-close, as the corresponding trajectories
remain close under a reparameterization of time (under which their periods
would become equal). As this is true for all orbits in a neighborhood of either of the animated trajectories, they are both orbitally stable. In fact, all orbits are orbitally stable for this
system, except for the saddle points and their connections.
Figure 5: Orbital stability for the nonlinear pendulum.
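As a rough numerical illustration of this orbital stability (a sketch: it uses the pendulum equations written above and compares sampled orbits as point sets, which serves as a proxy for \(\epsilon\)-closeness of these closed orbits):

```python
# Rough sketch: two pendulum orbits started nearby remain close as curves,
# even though their periods differ. Closeness is measured between point sets.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.spatial import cKDTree

def pendulum(t, z):
    theta, v = z
    return [v, -np.sin(theta)]

def orbit_points(z0, T=60.0):
    sol = solve_ivp(pendulum, (0.0, T), z0, max_step=0.01)
    return sol.y.T                      # sampled points (theta, v) along the orbit

a = orbit_points([1.0, 0.0])
b = orbit_points([1.05, 0.0])           # neighboring initial condition

# Largest distance from a point of one sampled orbit to the other orbit.
d_ab = cKDTree(b).query(a)[0].max()
d_ba = cKDTree(a).query(b)[0].max()
print(f"the two orbits stay within about {max(d_ab, d_ba):.3f} of each other as sets")
```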
Example: Linear flows on the torus
The flow on the two-torus,
\[
\dot{\theta}_1 = \omega_1, \qquad \dot{\theta}_2 = \omega_2, \qquad (\theta_1, \theta_2) \in [0, 2\pi) \times [0, 2\pi),
\]
is similar to the pendulum example above: here, all orbits are orbitally
stable, as their neighbors are \(\epsilon\)-close under reparameterization of time.
However, upon adding a third coordinate \(z\), constant along each orbit, on which the frequencies of the flow depend,
the situation changes dramatically. Consider two neighboring orbits
with slightly different initial values of \(z\). These
two orbits are linear flows on invariant two-tori with different, fixed
values of \(z\). Generically, the two flows are
irrational, so that each orbit is dense on its two-torus. Therefore,
the two orbits are close as sets. However, time
cannot be reparameterized so that the orbits will be close under the definition above, because the flows
have different slopes: see Figure 6.
Figure 6: Lack of orbital stability for the three-torus example.
Example: The two-body problem
Consider the following equations, written in polar coordinates
\((r, \phi)\), which describe a limiting case of the
gravitational dynamics of two bodies:
As for the last example above, orbits with different initial points are
linear flows on invariant two-tori which generally have different
frequency ratios (i.e., different slopes). Therefore, no orbits
(except for the equilibrium at the origin) are orbitally stable.
References
We thank both referees for their careful reading and suggestions, and one in particular for her/his correction of our definition of orbital stability and for providing one of the examples above.
- W.E. Boyce and R.C. DiPrima (1997). Elementary Differential Equations and Boundary Value Problems. Wiley, New York.
- M.W. Hirsch, S. Smale, and R.L. Devaney (2004). Differential Equations, Dynamical Systems and an Introduction to Chaos. Academic Press/Elsevier, San Diego.
- V.I. Arnold (1973). Ordinary Differential Equations. MIT Press, Cambridge, MA.
- L.N. Trefethen and M. Embree (2005). Spectra and Pseudospectra: The Behavior of Nonnormal Matrices and Operators. Princeton Univ. Press, Princeton, NJ.
See also
Attractor, Basin of attraction, Bifurcations, Chaos, Equilibrium, Fixed point, Periodic orbit, Unstable periodic orbits, Structural stability