Self-organization, a process in which some form of overall order arises from local interactions between the parts of an initially disordered system, was discovered in cybernetics by William Ross Ashby in 1947.[1][2] Ashby's principle states that any deterministic dynamic system automatically evolves towards a state of equilibrium that can be described in terms of an attractor in a basin of surrounding states. Once there, the further evolution of the system is constrained to remain in the attractor. This constraint implies a form of mutual dependency or coordination among the system's constituent components or subsystems. In Ashby's terms, each subsystem has adapted to the environment formed by all the other subsystems.[1]
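The convergence Ashby described can be sketched in a few lines. The following toy example (an illustrative assumption, not Ashby's own system) iterates a deterministic update rule from scattered initial states; every trajectory settles onto one of two point attractors, each surrounded by its own basin.

```python
# Minimal sketch (not from the source): a deterministic system started from
# scattered initial states settles into an attractor. The toy update rule
# x -> x - 0.1 * (x**3 - x) has two stable fixed points, x = -1 and x = +1,
# whose surrounding basins illustrate "an attractor in a basin of states".
import random

def step(x):
    # Deterministic update: gradient-like descent on a double-well landscape.
    return x - 0.1 * (x**3 - x)

for x0 in [random.uniform(-2.0, 2.0) for _ in range(5)]:
    x = x0
    for _ in range(200):          # evolve long enough to reach equilibrium
        x = step(x)
    print(f"start {x0:+.2f} -> attractor {x:+.2f}")  # ends near -1 or +1
```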
The cybernetician Heinz von Foerster formulated the principle of "order from noise" in 1960.[3][4] It notes that self-organization is facilitated by random perturbations ("noise") that let the system explore a variety of states in its state space. This increases the chance that the system will arrive in the basin of a "strong" or "deep" attractor, from which it then quickly enters the attractor itself. The biophysicist Henri Atlan developed a similar concept, the principle of "complexity from noise"[5][6] (French: le principe de complexité par le bruit),[7] first in the 1972 book L'organisation biologique et la théorie de l'information[8] and then in the 1979 book Entre le cristal et la fumée.[9] The thermodynamicist Ilya Prigogine formulated a similar principle as "order through fluctuations"[10] or "order out of chaos".[11] The principle is applied in the method of simulated annealing for problem solving and machine learning.[12]
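The "order from noise" idea is easiest to see in the simulated annealing method mentioned above. The sketch below is a generic annealer; the objective function, step size and cooling schedule are arbitrary choices made here for illustration.

```python
# Hedged illustration of "order from noise" via simulated annealing:
# random perturbations let the search escape shallow minima, and lowering
# the "temperature" lets it settle into a deep attractor of the objective.
import math, random

def anneal(f, x, t=2.0, cooling=0.99, steps=2000):
    """Minimize f by accepting noisy moves under a slowly lowered temperature t."""
    best = x
    for _ in range(steps):
        cand = x + random.gauss(0, 0.5)            # random perturbation ("noise")
        d = f(cand) - f(x)
        if d < 0 or random.random() < math.exp(-d / t):
            x = cand                               # accept downhill moves, or uphill with prob. exp(-d/t)
        if f(x) < f(best):
            best = x
        t *= cooling                               # reduce the noise: the system settles
    return best

# A rugged objective with many shallow minima and one deep basin near x = 0.
f = lambda x: x**2 + 3 * math.sin(5 * x)
print(anneal(f, x=8.0))   # typically ends near the deepest minimum
```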
Norbert Wiener regarded the automatic serial identification of a black box and its subsequent reproduction (copying) as sufficient to meet the condition of self-organization.[13] The importance of phase locking, or the "attraction of frequencies" as he called it, is discussed in the second edition of his Cybernetics.[14] K. Eric Drexler sees self-replication (copying) as a key step in nano and universal assembly.[15] In later work he seeks to lessen this constraint.[16]
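Phase locking, Wiener's "attraction of frequencies", can be illustrated with two weakly coupled oscillators. The Kuramoto-style sketch below is an assumed example rather than Wiener's own formulation: above a critical coupling strength, two different natural frequencies are pulled into a fixed phase relationship.

```python
# Assumed example of the "attraction of frequencies": two coupled oscillators
# with different natural frequencies lock onto a common rhythm once the
# coupling K is strong enough relative to their frequency difference.
import math

def simulate(K, w1=1.0, w2=1.3, dt=0.01, steps=5000):
    p1, p2 = 0.0, 2.0
    for _ in range(steps):
        dp1 = w1 + K * math.sin(p2 - p1)   # each phase is pulled toward the other
        dp2 = w2 + K * math.sin(p1 - p2)
        p1 += dp1 * dt
        p2 += dp2 * dt
    return (p2 - p1) % (2 * math.pi)       # settled phase difference

print("weak coupling  :", simulate(K=0.05))  # keeps drifting, no lock
print("strong coupling:", simulate(K=0.5))   # converges to a fixed phase difference
```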
By contrast, the four concurrently connected galvanometers of W. Ross Ashby's Homeostat, when perturbed, hunt until they converge on one of many possible stable states.[17] Ashby used his state-counting measure of variety[18] to describe stable states, and produced the "Good Regulator" theorem,[19] which requires internal models for self-organized endurance and stability (e.g. the Nyquist stability criterion).
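The Homeostat's hunting behaviour can be caricatured as follows. This sketch uses illustrative assumptions throughout (it is not Ashby's circuit): a small linear system re-randomizes its coupling weights whenever a variable leaves its safe bounds, and keeps doing so until a stable configuration is found.

```python
# Rough sketch of Homeostat-style "hunting": whenever an essential variable
# leaves its bounds, the coupling weights are re-randomized (a stand-in for
# Ashby's uniselector step) until a configuration is found in which the
# four units settle. Dynamics and bounds are illustrative assumptions.
import random

N, BOUND = 4, 10.0

def random_weights():
    return [[random.uniform(-1, 1) for _ in range(N)] for _ in range(N)]

def run():
    w = random_weights()
    x = [random.uniform(-1, 1) for _ in range(N)]
    resets = 0
    for _ in range(20000):
        x = [sum(w[i][j] * x[j] for j in range(N)) * 0.5 for i in range(N)]
        if any(abs(v) > BOUND for v in x):      # essential variable out of range:
            w = random_weights()                # "hunt" for a new configuration
            x = [random.uniform(-1, 1) for _ in range(N)]
            resets += 1
    return resets, x

resets, x = run()
print(f"re-configured {resets} times; settled state: {[round(v, 3) for v in x]}")
```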
Warren McCulloch proposed "Redundancy of Potential Command"[20] as a characteristic of the organization of the brain and human nervous system, and as the necessary condition for self-organization.
Heinz von Foerster proposed Redundancy, R = 1 − H/Hmax, where H is the entropy of the system and Hmax is the maximum entropy it could attain.[21][22] In essence this states that unused potential communication bandwidth is a measure of self-organization.
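A small worked example, using the symbols from the text and an assumed sample distribution, shows how the measure behaves:

```python
# Worked example of von Foerster's measure: R = 1 - H/Hmax, where H is the
# observed entropy and Hmax = log2(n) is the entropy of a uniform distribution
# over the same n states. The sample distribution is an assumption.
import math

def entropy(p):
    return -sum(q * math.log2(q) for q in p if q > 0)

p = [0.7, 0.1, 0.1, 0.1]            # an ordered (constrained) system
H, H_max = entropy(p), math.log2(len(p))
R = 1 - H / H_max
print(f"H = {H:.3f} bits, Hmax = {H_max:.3f} bits, R = {R:.3f}")
# R = 0 for a fully disordered (uniform) system; R -> 1 as order/constraint grows.
```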
In the 1970s Stafford Beer considered this condition necessary for autonomy, which identifies self-organization in persisting and living systems. He applied his viable system model to management. It consists of five parts: the monitoring of performance of the survival processes (1), their management by recursive application of regulation (2), homeostatic operational control (3) and development (4), which together produce maintenance of identity (5) under environmental perturbation. Focus is prioritized by an alerting "algedonic loop" feedback: a sensitivity to both pain and pleasure produced by under-performance or over-performance relative to a standard capability.[23][full citation needed]
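One way to read the algedonic loop is as a threshold filter on performance relative to standard capability; the sketch below is an interpretation for illustration, not Beer's own formalism, and the tolerance band is an assumed parameter.

```python
# Interpretive sketch of an algedonic alert: performance is compared with a
# standard capability band, and "pain" or "pleasure" signals escalate attention
# only when the band is breached; within the band, local control suffices.
def algedonic_signal(actual, capability, tolerance=0.1):
    """Return None inside the tolerance band, else an escalating alert."""
    ratio = actual / capability
    if ratio < 1 - tolerance:
        return ("pain", 1 - ratio)       # under-performance: escalate to management
    if ratio > 1 + tolerance:
        return ("pleasure", ratio - 1)   # over-performance: also worth attention
    return None                          # no escalation needed

for actual in (70, 95, 120):
    print(actual, "->", algedonic_signal(actual, capability=100))
```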
In the 1990s Gordon Pask pointed out that von Foerster's H and Hmax were not independent but interacted via countably infinite recursive concurrent spin processes[24] (he favoured the Bohm interpretation), which he called concepts (liberally defined in any medium, "productive and, incidentally, reproductive"). His strict definition of a concept, "a procedure to bring about a relation",[25] permitted his theorem "Like concepts repel, unlike concepts attract"[26] to state a general spin-based principle of self-organization. His edict, an exclusion principle, "There are No Doppelgangers",[27][24] means that no two concepts can be the same (all interactions occur with different perspectives, making time incommensurable for actors). This means that, after sufficient duration as differences assert themselves, all concepts will attract and coalesce as pink noise and entropy increases (see also Big Crunch, self-organized criticality). The theory is applicable to all organizationally closed or homeostatic processes that produce enduring and coherent products through interactions: evolving, learning and adapting (where spins have a fixed average phase relationship, and also in the sense of Nicholas Rescher's coherence theory of truth, with the proviso that the sets and their members exert repulsive forces at their boundaries).
Pask's Interactions of Actors "hard carapace" model is reflected in some of the ideas of emergence and coherence. It requires a knot emergence topology that produces radiation during interaction with a unit cell that has a prismatic tensegrity structure. Laughlin's contribution to emergence reflects some of these constraints.[28]