The logistic map is a polynomial mapping (equivalently, recurrence relation) of degree 2, often referred to as an archetypal example of how complex, chaotic behaviour can arise from very simple nonlinear dynamical equations. The map was popularized in a 1976 paper by the biologist Robert May,[1] in part as a discrete-time demographic model analogous to the logistic equation written down by Pierre François Verhulst.[2] Mathematically, the logistic map is written
[math]\displaystyle{ x_{n+1} = r x_n (1 - x_n), }[/math]    (1)
where xn is a number between zero and one, which represents the ratio of the existing population to the maximum possible population. This nonlinear difference equation is intended to capture two effects: reproduction, whereby the population increases at a rate proportional to the current population when the population size is small, and starvation (density-dependent mortality), whereby the growth rate decreases at a rate proportional to the value obtained by taking the theoretical carrying capacity of the environment less the current population.
The usual values of interest for the parameter r are those in the interval [0, 4], so that xn remains bounded on [0, 1]. The r = 4 case of the logistic map is a nonlinear transformation of both the bit-shift map and the μ = 2 case of the tent map. If r > 4, almost all initial values eventually leave the interval [0, 1], and the iterates then become negative, corresponding to negative population sizes. (This problem does not appear in the older Ricker model, which also exhibits chaotic dynamics.) One can also consider values of r in the interval [−2, 0], so that xn remains bounded on [−0.5, 1.5].[3]
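As a minimal sketch (assuming nothing beyond the recurrence itself; the parameter and initial value below are arbitrary illustrative choices), the map can be iterated directly in a few lines of Python:

def logistic_step(r, x):
    # one application of the logistic map x -> r*x*(1 - x)
    return r * x * (1 - x)

r, x = 3.2, 0.2  # parameter in [0, 4], initial state in [0, 1]
for n in range(10):
    print(n, x)
    x = logistic_step(r, x)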
The image below shows the amplitude and frequency content of some logistic map iterates for parameter values ranging from 2 to 4.
By varying the parameter r, the following behavior is observed: for r between 0 and 1 the population eventually dies out, independent of the initial condition; for r between 1 and 3 it approaches the fixed point (r − 1)/r; for r between 3 and about 3.44949 it approaches permanent oscillations between two values; further increases of r yield oscillations of period 4, 8, 16, and so on in a period-doubling cascade whose successive bifurcation intervals shrink by a ratio approaching the Feigenbaum constant δ ≈ 4.669; at r ≈ 3.56995 chaos sets in, and most larger values of r exhibit chaotic behavior, interrupted by windows ("islands of stability") such as the period-3 window beginning near r = 1 + √8 ≈ 3.82843.
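A hedged numerical sketch of this progression (the r values, initial condition, and transient length below are illustrative choices, not canonical ones): iterate the map long enough to discard transients, then inspect which values the orbit keeps visiting.

def orbit_tail(r, x0=0.2, warmup=1000, keep=8):
    # iterate the logistic map and return the last `keep` iterates after a warm-up
    x = x0
    for _ in range(warmup):
        x = r * x * (1 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1 - x)
        tail.append(round(x, 4))
    return tail

for r in (0.5, 2.5, 3.2, 3.5, 3.9):  # die-out, fixed point, 2-cycle, 4-cycle, chaos
    print(r, orbit_tail(r))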
For any value of r there is at most one stable cycle. If a stable cycle exists, it is globally stable, attracting almost all points.[11]:13 Some values of r with a stable cycle of some period have infinitely many unstable cycles of various periods.
The bifurcation diagram at right summarizes this. The horizontal axis shows the possible values of the parameter r while the vertical axis shows the set of values of x visited asymptotically from almost all initial conditions by the iterates of the logistic equation with that r value.
The bifurcation diagram is self-similar: if we zoom in on the above-mentioned value r ≈ 3.82843 and focus on one arm of the three, the situation nearby looks like a shrunk and slightly distorted version of the whole diagram. The same is true for all other non-chaotic points. This is an example of the deep and ubiquitous connection between chaos and fractals.
Negative values of r can also be considered; as noted above, for r in the interval [−2, 0] the iterates remain bounded on [−0.5, 1.5], and a bifurcation structure analogous to that of the positive range appears.[3]
The relative simplicity of the logistic map makes it a widely used point of entry into a consideration of the concept of chaos. A rough description of chaos is that chaotic systems exhibit a great sensitivity to initial conditions—a property of the logistic map for most values of r between about 3.57 and 4 (as noted above).[1] A common source of such sensitivity to initial conditions is that the map represents a repeated folding and stretching of the space on which it is defined. In the case of the logistic map, the quadratic difference equation describing it may be thought of as a stretching-and-folding operation on the interval (0,1).[12]
The following figure illustrates the stretching and folding over a sequence of iterates of the map. Figure (a), left, shows a two-dimensional Poincaré plot of the logistic map's state space for r = 4, and clearly shows the quadratic curve of the difference equation (1). However, we can embed the same sequence in a three-dimensional state space, in order to investigate the deeper structure of the map. Figure (b), right, demonstrates this, showing how initially nearby points begin to diverge, particularly in those regions of xt corresponding to the steeper sections of the plot.
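The return plot of panel (a) can be reproduced with a short sketch (sample size, initial value, and plotting details are arbitrary choices): plot each iterate against its successor for r = 4.

import numpy as np
import matplotlib.pyplot as plt

n = 5000
x = np.empty(n)
x[0] = 0.2
for t in range(n - 1):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

plt.scatter(x[:-1], x[1:], s=1)  # the (x_t, x_{t+1}) pairs trace out the parabola 4x(1 - x)
plt.xlabel("$x_t$")
plt.ylabel("$x_{t+1}$")
plt.show()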
This stretching-and-folding does not just produce a gradual divergence of the sequences of iterates, but an exponential divergence (see Lyapunov exponents), evidenced also by the complexity and unpredictability of the chaotic logistic map. In fact, exponential divergence of sequences of iterates explains the connection between chaos and unpredictability: a small error in the supposed initial state of the system will tend to correspond to a large error later in its evolution. Hence, predictions about future states become progressively (indeed, exponentially) worse when there are even very small errors in our knowledge of the initial state. This quality of unpredictability and apparent randomness led the logistic map equation to be used as a pseudo-random number generator in early computers.[12]
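A rough sketch of that historical use (the thresholding scheme below is only an illustration, not a description of any particular early implementation): iterate the r = 4 map and extract one bit per iterate.

def logistic_bits(x0=0.3, n=32):
    # generate n pseudo-random bits by thresholding iterates of the r = 4 logistic map;
    # a toy illustration only -- such generators have well-known statistical weaknesses
    x, bits = x0, []
    for _ in range(n):
        x = 4.0 * x * (1.0 - x)
        bits.append(1 if x >= 0.5 else 0)
    return bits

print(logistic_bits())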
At r = 2, the function [math]\displaystyle{ rx(1-x) }[/math] intersects [math]\displaystyle{ y = x }[/math] precisely at the maximum point, so convergence to the equilibrium point is on the order of [math]\displaystyle{ \delta^{2^n} }[/math]. Consequently, the equilibrium point is called "superstable". Its Lyapunov exponent is [math]\displaystyle{ -\infty }[/math]. A similar argument shows that there is a superstable [math]\displaystyle{ r }[/math] value within each interval where the dynamical system has a stable cycle. This can be seen in the Lyapunov exponent plot as sharp dips.[13]
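The dips can be seen numerically with a sketch along the following lines (the r grid, transient length, and averaging length are arbitrary choices): the Lyapunov exponent is estimated as the orbit average of ln|r(1 − 2x)|, i.e. ln|f′(x)|, and it plunges toward −∞ near superstable parameter values.

import numpy as np
import matplotlib.pyplot as plt

rs = np.arange(2.5, 4.0, 0.001)
lyap = np.zeros_like(rs)
for j, r in enumerate(rs):
    x = 0.5
    for _ in range(500):           # discard transients
        x = r * x * (1 - x)
    s = 0.0
    for _ in range(1000):          # average log|f'(x)| along the orbit
        x = r * x * (1 - x)
        s += np.log(abs(r * (1 - 2 * x)) + 1e-300)  # guard against log(0)
    lyap[j] = s / 1000

plt.plot(rs, lyap, linewidth=0.5)
plt.axhline(0, color="k", linewidth=0.5)
plt.xlabel("r")
plt.ylabel("Lyapunov exponent")
plt.show()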
Since the map is confined to an interval on the real number line, its dimension is less than or equal to unity. Numerical estimates yield a correlation dimension of 0.500 ± 0.005 (Grassberger, 1983), a Hausdorff dimension of about 0.538 (Grassberger 1981), and an information dimension of approximately 0.5170976 (Grassberger 1983) for r ≈ 3.5699456 (onset of chaos). It can be shown that the correlation dimension certainly lies between 0.4926 and 0.5024.
It is often possible, however, to make precise and accurate statements about the likelihood of a future state in a chaotic system. If a (possibly chaotic) dynamical system has an attractor, then there exists a probability measure that gives the long-run proportion of time spent by the system in the various regions of the attractor. In the case of the logistic map with parameter r = 4 and an initial state in (0,1), the attractor is also the interval (0,1) and the probability measure corresponds to the beta distribution with parameters a = 0.5 and b = 0.5. Specifically,[14] the invariant measure is
[math]\displaystyle{ \frac{1}{\pi\sqrt{x(1-x)}}. }[/math]
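A minimal numerical check (sample size, initial value, and bin count are arbitrary): histogram a long r = 4 orbit and compare it with the beta(1/2, 1/2) density 1/(π√(x(1 − x))).

import numpy as np
import matplotlib.pyplot as plt

n = 100_000
xs = np.empty(n)
xs[0] = 0.3                      # any initial state in (0, 1) off the measure-zero periodic orbits
for i in range(n - 1):
    xs[i + 1] = 4.0 * xs[i] * (1.0 - xs[i])

grid = np.linspace(0.001, 0.999, 400)
plt.hist(xs, bins=100, density=True, alpha=0.5, label="iterates")
plt.plot(grid, 1.0 / (np.pi * np.sqrt(grid * (1 - grid))), label="beta(1/2, 1/2) density")
plt.xlabel("x")
plt.legend()
plt.show()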
Unpredictability is not randomness, but in some circumstances looks very much like it. Hence, and fortunately, even if we know very little about the initial state of the logistic map (or some other chaotic system), we can still say something about the distribution of states arbitrarily far into the future, and use this knowledge to inform decisions based on the state of the system.
The bifurcation diagram for the logistic map can be visualized with the following Python code:
import numpy as np
import matplotlib.pyplot as plt

interval = (2.8, 4)  # start, end values of r
accuracy = 0.0001    # step size in r
reps = 600           # number of repetitions (iterations per r value)
numtoplot = 200      # number of trailing iterates to plot
lims = np.zeros(reps)

fig, biax = plt.subplots()
fig.set_size_inches(16, 9)

lims[0] = np.random.rand()
for r in np.arange(interval[0], interval[1], accuracy):
    for i in range(reps - 1):
        lims[i + 1] = r * lims[i] * (1 - lims[i])
    biax.plot([r] * numtoplot, lims[reps - numtoplot:], "b.", markersize=0.02)

biax.set(xlabel="r", ylabel="x", title="logistic map")
plt.show()
Although exact solutions to the recurrence relation are only available in a small number of cases, a closed-form upper bound on the logistic map is known when 0 ≤ r ≤ 1.[15] There are two aspects of the behavior of the logistic map that should be captured by an upper bound in this regime: the asymptotic geometric decay with constant r, and the fast initial decay when x0 is close to 1, driven by the (1 − xn) term in the recurrence relation. The following bound captures both of these effects:
The special case of r = 4 can in fact be solved exactly, as can the case with r = 2;[16] however, the general case can only be predicted statistically.[17] The solution when r = 4 is[16][18]
[math]\displaystyle{ x_n = \sin^2\left(2^n \theta \pi\right), }[/math]
where the initial condition parameter θ is given by
[math]\displaystyle{ \theta = \tfrac{1}{\pi}\sin^{-1}\left(\sqrt{x_0}\right). }[/math]
For rational θ, after a finite number of iterations xn maps into a periodic sequence. But almost all θ are irrational, and, for irrational θ, xn never repeats itself – it is non-periodic. This solution equation clearly demonstrates the two key features of chaos – stretching and folding: the factor 2^n shows the exponential growth of stretching, which results in sensitive dependence on initial conditions, while the squared sine function keeps xn folded within the range [0,1].
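A quick numerical check of this closed form (the initial value below is arbitrary): compare direct iteration of the r = 4 map with sin²(2ⁿθπ).

import math

x0 = 0.3
theta = math.asin(math.sqrt(x0)) / math.pi

x = x0
for n in range(8):
    closed_form = math.sin(2**n * theta * math.pi) ** 2
    print(n, x, closed_form)  # the two columns agree up to rounding error
    x = 4.0 * x * (1.0 - x)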
For r = 4 an equivalent solution in terms of complex numbers instead of trigonometric functions is[19]
[math]\displaystyle{ x_n = \frac{2 - \alpha^{2^n} - \alpha^{-2^n}}{4}, }[/math]
where α is either of the complex numbers
[math]\displaystyle{ \alpha = 1 - 2x_0 \pm 2i\sqrt{x_0(1 - x_0)} }[/math]
with modulus equal to 1. Just as the squared sine function in the trigonometric solution leads to neither shrinkage nor expansion of the set of points visited, in the latter solution this effect is accomplished by the unit modulus of α.
By contrast, the solution when r = 2 is[19]
[math]\displaystyle{ x_n = \frac{1}{2} - \frac{1}{2}\left(1 - 2x_0\right)^{2^n} }[/math]
for x0 ∈ [0,1). Since (1 − 2x0) ∈ (−1,1) for any value of x0 other than the unstable fixed point 0, the term (1 − 2x0)^(2^n) goes to 0 as n goes to infinity, so xn goes to the stable fixed point 1/2.
For the r = 4 case, from almost all initial conditions the iterate sequence is chaotic. Nevertheless, there exist an infinite number of initial conditions that lead to cycles, and indeed there exist cycles of length k for all integers k > 0. We can exploit the relationship of the logistic map to the dyadic transformation (also known as the bit-shift map) to find cycles of any length. If x follows the logistic map xn + 1 = 4xn(1 − xn) and y follows the dyadic transformation
[math]\displaystyle{ y_{n+1} = 2 y_n \bmod 1, }[/math]
then the two are related by the transformation
[math]\displaystyle{ x_n = \sin^2(2 \pi y_n). }[/math]
The reason that the dyadic transformation is also called the bit-shift map is that when y is written in binary notation, the map moves the binary point one place to the right (and if the bit to the left of the binary point has become a "1", this "1" is changed to a "0"). A cycle of length 3, for example, occurs if an iterate has a 3-bit repeating sequence in its binary expansion (which is not also a one-bit repeating sequence): 001, 010, 100, 110, 101, or 011. The iterate 001001001... maps into 010010010..., which maps into 100100100..., which in turn maps into the original 001001001...; so this is a 3-cycle of the bit shift map. And the other three binary-expansion repeating sequences give the 3-cycle 110110110... → 101101101... → 011011011... → 110110110.... Either of these 3-cycles can be converted to fraction form: for example, the first-given 3-cycle can be written as 1/7 → 2/7 → 4/7 → 1/7. Using the above translation from the bit-shift map to the [math]\displaystyle{ r = 4 }[/math] logistic map gives the corresponding logistic cycle 0.611260467... → 0.950484434... → 0.188255099... → 0.611260467.... We could similarly translate the other bit-shift 3-cycle into its corresponding logistic cycle. Likewise, cycles of any length k can be found in the bit-shift map and then translated into the corresponding logistic cycles.
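A short sketch of that translation (using the relation x = sin²(2πy) given above and the example value y = 1/7): step the bit-shift map and print the corresponding logistic-map states.

import math

y = 1 / 7
for _ in range(4):
    x = math.sin(2 * math.pi * y) ** 2  # corresponding state of the r = 4 logistic map
    print(f"y = {y:.6f}  ->  x = {x:.9f}")
    y = (2 * y) % 1                     # one step of the bit-shift (dyadic) map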
However, since almost all numbers in [0,1) are irrational, almost all initial conditions of the bit-shift map lead to the non-periodicity of chaos. This is one way to see that the logistic r = 4 map is chaotic for almost all initial conditions.
The number of cycles of (minimal) length k = 1, 2, 3,… for the logistic map with r = 4 (tent map with μ = 2) is a known integer sequence (sequence A001037 in the OEIS): 2, 1, 2, 3, 6, 9, 18, 30, 56, 99, 186, 335, 630, 1161.... This tells us that the logistic map with r = 4 has 2 fixed points, 1 cycle of length 2, 2 cycles of length 3 and so on. This sequence takes a particularly simple form for prime k: [math]\displaystyle{ 2 \cdot \frac{2^{k-1} - 1}{k} }[/math]. For example: [math]\displaystyle{ 2 \cdot \frac{2^{12} - 1}{13} = 630 }[/math] is the number of cycles of length 13. Since this case of the logistic map is chaotic for almost all initial conditions, all of these finite-length cycles are unstable.
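These counts can be reproduced with the standard Möbius (necklace-counting) formula for binary Lyndon words, of which the prime-k expression above is a special case; a sketch, assuming only that formula:

def mobius(n):
    # Moebius function via trial factorization
    if n == 1:
        return 1
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:  # repeated prime factor
                return 0
            result = -result
        p += 1
    if n > 1:
        result = -result
    return result

def cycle_count(k):
    # number of cycles of minimal length k for the r = 4 logistic map
    # (number of binary Lyndon words of length k, OEIS A001037)
    return sum(mobius(d) * 2 ** (k // d) for d in range(1, k + 1) if k % d == 0) // k

print([cycle_count(k) for k in range(1, 15)])
# [2, 1, 2, 3, 6, 9, 18, 30, 56, 99, 186, 335, 630, 1161]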
The universality of one-dimensional maps with parabolic maxima and the Feigenbaum constants [math]\displaystyle{ \delta=4.669201... }[/math], [math]\displaystyle{ \alpha=2.502907... }[/math][20][21] is clearly visible in a map proposed as a toy model for discrete laser dynamics: [math]\displaystyle{ x \rightarrow G x (1 - \tanh (x)) }[/math], where [math]\displaystyle{ x }[/math] stands for the electric field amplitude and [math]\displaystyle{ G }[/math][22] is the laser gain, which serves as the bifurcation parameter.
Gradually increasing [math]\displaystyle{ G }[/math] over the interval [math]\displaystyle{ [0, \infty) }[/math] changes the dynamics from regular to chaotic,[23] with a bifurcation diagram qualitatively the same as that of the logistic map.
The Feigenbaum constants can be estimated by a renormalization argument (Section 10.7 of [13]).
By universality, we can use another family of functions that also undergoes repeated period-doubling on its route to chaos; even though it is not exactly the logistic map, it still yields the same Feigenbaum constants.
Define the family [math]\displaystyle{ f_r(x) = -(1+r)x + x^2 }[/math]. The family has an equilibrium point at zero, and as [math]\displaystyle{ r }[/math] increases, it undergoes period-doubling bifurcations at [math]\displaystyle{ r = r_0, r_1, r_2, ... }[/math].
The first bifurcation occurs at [math]\displaystyle{ r = r_0 = 0 }[/math]. After the period-doubling bifurcation, we can solve for the period-2 stable orbit from [math]\displaystyle{ f_r(p) = q, f_r(q) = p }[/math], which yields [math]\displaystyle{ \begin{cases} p = \frac 12 (r + \sqrt{r(r+4)}) \\ q = \frac 12 (r - \sqrt{r(r+4)}) \end{cases} }[/math] At some point [math]\displaystyle{ r = r_1 }[/math], the period-2 stable orbit undergoes a period-doubling bifurcation again, yielding a period-4 stable orbit. In order to find out what the stable orbit is like, we "zoom in" around the region of [math]\displaystyle{ x = p }[/math], using the affine transform [math]\displaystyle{ T(x) = x/c + p }[/math]. Now, by routine algebra, we have [math]\displaystyle{ (T^{-1}\circ f_r^2 \circ T)(x) = -(1+S(r)) x + x^2 + O(x^3) }[/math] where [math]\displaystyle{ S(r) = r^2 + 4r - 2, c = r^2 + 4r - 3\sqrt{r(r+4)} }[/math]. The second bifurcation occurs approximately when [math]\displaystyle{ S(r) = 0 }[/math], thus [math]\displaystyle{ S(r_1) \approx 0 }[/math].
By self-similarity, the third bifurcation occurs when [math]\displaystyle{ S(r) \approx r_1 }[/math], and so on. Thus we have [math]\displaystyle{ r_n \approx S(r_{n+1}) }[/math], or [math]\displaystyle{ r_{n+1} \approx \sqrt{r_{n}+6}-2 }[/math]. Iterating this map, we find [math]\displaystyle{ r_\infty = \lim_n r_n \approx \lim_n S^{-n}(0) = \frac 12(\sqrt{17}-3) }[/math], and [math]\displaystyle{ \lim_n \frac{r_\infty - r_n}{r_\infty - r_{n+1}} \approx S'(r_\infty) \approx 1 + \sqrt{17} }[/math].
Thus, we have the estimates [math]\displaystyle{ \delta \approx 1+\sqrt{17} = 5.12... }[/math], and [math]\displaystyle{ \alpha \approx r_\infty^2 +4r_\infty- 3 \sqrt{r_\infty^2+4r_\infty} \approx -2.24... }[/math]. These are within 10% of the true values.
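A small numerical sketch of this estimate (nothing below is specific to the logistic map beyond the derivation above): iterate r_{n+1} ≈ √(r_n + 6) − 2 from r_0 = 0 and compare the limit and the resulting constants with the closed-form values.

import math

r = 0.0                            # r_0, the first bifurcation point
for _ in range(50):
    r = math.sqrt(r + 6) - 2       # r_{n+1} ~ sqrt(r_n + 6) - 2

r_inf = (math.sqrt(17) - 3) / 2    # fixed point of the iteration
delta_est = 1 + math.sqrt(17)      # S'(r_inf) = 2*r_inf + 4
alpha_est = r_inf**2 + 4 * r_inf - 3 * math.sqrt(r_inf**2 + 4 * r_inf)

print(r, r_inf)                    # the iteration converges to r_inf ~ 0.5616
print(delta_est, alpha_est)        # ~ 5.12 and ~ -2.24, within about 10% of the true values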
Original source: https://en.wikipedia.org/wiki/Logistic_map.