How Can There be Order in Randomness
Thomas Pynchon, in his sprawling novel Gravity’s Rainbow (1973), exhibited a fascination with the peculiar mathematics of the Poisson distribution, a pattern found in certain random sequences (including the locations of German rocket strikes on London during WWII). Sometimes referred to as the “law of rare events”, the Poisson distribution has proven to apply to phenomena as disparate as the volume of Internet traffic, deaths per year in a given age group, DNA mutations resulting from radiation, and goals scored in sports with two competing teams.
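To see how such order emerges from pure randomness, consider a small toy calculation, a sketch only, in which the grid size and number of “strikes” are illustrative choices rather than the actual wartime data: scatter points uniformly at random over a grid of squares and compare the counts per square with the Poisson formula P(k) = λ^k · e^(−λ) / k!.

```python
import math
import random
from collections import Counter

# Scatter "strikes" uniformly at random over a grid of squares and
# compare the observed counts per square with the Poisson prediction.
random.seed(42)

GRID = 24                      # 24 x 24 = 576 squares (illustrative choice)
STRIKES = 537                  # number of random events to scatter (illustrative)
lam = STRIKES / GRID**2        # mean events per square (lambda)

# Count how many strikes land in each square.
counts = Counter()
for _ in range(STRIKES):
    x, y = random.randrange(GRID), random.randrange(GRID)
    counts[(x, y)] += 1

observed = Counter(counts[(x, y)] for x in range(GRID) for y in range(GRID))

# Poisson probability of exactly k events in one square.
def poisson(k, lam):
    return lam**k * math.exp(-lam) / math.factorial(k)

print(f"{'k':>3} {'observed':>9} {'expected':>9}")
for k in range(6):
    expected = poisson(k, lam) * GRID**2
    print(f"{k:>3} {observed.get(k, 0):>9} {expected:>9.1f}")
```

Even though no square is targeted, the tally of squares hit zero times, once, twice, and so on tracks the Poisson curve closely.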
Emergence
How is it that mathematical patterns exist for random events? This is the puzzle explored in the mathematics of complexity, a family of related fields that play an increasingly important role in the study of complex systems across physics, biology, economics and sociology. For an excellent, non-technical resource, I highly recommend Melanie Mitchell’s 2009 book, Complexity: A Guided Tour. More recently, Ms. Mitchell posted an article on “Complexity” in The Templeton Foundation’s Big Questions Online series, where you could see both the article and the exchange of comments. (Unfortunately, this material is no longer posted.)
Quite remarkably, complex systems often exhibit self-organizing behavior arising from the independent and uncoordinated actions of the entities that comprise them. For example, an army ant is an individual organism with fixed and quite simple behavior patterns. Yet in a colony of millions of individual army ants, intelligent and sophisticated adaptive behaviors appear spontaneously at the collective level, without the benefit of either top-down control or intelligence at the individual level. There are no generals in this army, yet the behavior appears coordinated. Such behavior is referred to as “emergent.”
How does intelligent behavior appear in a system with no apparent command? The answer seems to lie in the way the system explores for information about the environment and then adapts based on that information. For example, food gathering by an army ant colony relies on the foraging behavior of individual ants. Ants on a scouting mission begin searching in random patterns, but if one finds food it returns straight to the colony, leaving a pheromone trail. In addition to random searching, ants will also sometimes switch to following pheromone trails, tending to reinforce successful pathways to food. Through a balance of random exploration and trail-following by a large number of individual ants, a complex map of food sources and pheromone trails develops. This map is not in the mind of any one ant, but in the collective behavior of all the ants in the colony, as if the colony itself had a mind.
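A minimal sketch of this kind of trail reinforcement can be written as a toy simulation: two candidate trails, arbitrary pheromone and evaporation parameters, nothing taken from studies of real army ants. It shows how the shorter route comes to dominate without any ant knowing the map.

```python
import random

# Toy model of pheromone-reinforced foraging: ants repeatedly choose
# between two trails to a food source.  Shorter trips deposit more
# pheromone per unit time, so the better trail is gradually reinforced --
# a "map" that exists only in the collective trail, not in any single ant.
random.seed(1)

trail_length = {"short": 1.0, "long": 2.0}     # arbitrary path lengths
pheromone = {"short": 1.0, "long": 1.0}        # start with no preference
EVAPORATION = 0.02                             # fraction lost each round
EXPLORE = 0.10                                 # chance an ant ignores trails and wanders

def choose_trail():
    """Follow pheromone most of the time, wander at random otherwise."""
    if random.random() < EXPLORE:
        return random.choice(list(trail_length))
    total = sum(pheromone.values())
    return "short" if random.random() < pheromone["short"] / total else "long"

for round_ in range(500):
    for _ in range(20):                        # 20 ants forage per round
        trail = choose_trail()
        pheromone[trail] += 1.0 / trail_length[trail]   # shorter trail gets more deposit
    for t in pheromone:                        # pheromone slowly evaporates
        pheromone[t] *= (1.0 - EVAPORATION)

share = pheromone["short"] / sum(pheromone.values())
print(f"share of pheromone on the short trail: {share:.2f}")
```

The balance between wandering and following is the key design choice: with no random exploration the colony locks in whatever trail it finds first; with too much, no trail is ever reinforced.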
Similar complex behaviors are demonstrated in the human immune system as it responds to infection, and in cellular metabolism as the soup of organic chemicals responds to changing conditions inside and outside the living cell. In all these cases, the processes that we observe appear to be finely tuned, reflecting what must be a long and incredibly complex evolutionary history. The Darwinian explanation for these emergent phenomena is that incremental changes or mutations at the component level are tested against the environmental conditions (the fitness landscape), and the optimal changes are selected for by reproduction. In short, the behaviors arise as a result of the evolutionary benefits to the colony, or the body, reinforced through a selection process that finely tunes the system over many, many generations.
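That selection loop can be caricatured in a few lines of code, a generic mutate-and-select toy on a made-up fitness landscape rather than a model of any real colony or body:

```python
import random

# Toy mutation-and-selection loop: small random changes at the component
# level are kept only when they score well on a fixed fitness landscape.
random.seed(7)

def fitness(genome):
    # A deliberately simple landscape: fitness is just the count of 1-bits.
    return sum(genome)

population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]

for generation in range(200):
    # Reproduce with occasional mutation...
    offspring = []
    for parent in population:
        child = parent[:]
        if random.random() < 0.3:                  # mutation rate (arbitrary)
            i = random.randrange(len(child))
            child[i] = 1 - child[i]
        offspring.append(child)
    # ...then selection keeps the fitter half of parents plus offspring.
    population = sorted(population + offspring, key=fitness, reverse=True)[:30]

print("best fitness after selection:", fitness(population[0]))
```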
Interconnection
Another feature of many complex systems, including the three examples in the last paragraph, is that they can be viewed as a network of connected parts. Army ants operate as a network of foragers, the immune system as a network of cells, the metabolic system as a network of chemicals. As the individual parts bump into each other, they communicate, creating a network of nodes and channels. Networks that develop in nature commonly share two characteristics. First, they are what are known as “small-world networks”, which demonstrate a great deal of interconnectedness among nodes. This means there is a small average “degree of separation” between any two parts, so transmission of information from one node to all nodes is relatively efficient compared with a network having a higher average degree of separation. Second, they exhibit a pattern referred to as “scale-free”. Scale-free networks demonstrate a consistent pattern of connectedness: a very few nodes are highly connected (the hubs), while most nodes have only a few connections. The distribution of connections in a scale-free network can be drawn as a smooth curve that looks the same at every scale of the graph. The curve is described mathematically by what is known as a power-law distribution (e.g., P(k) ∝ 1/k^x, where P(k) is the fraction of nodes having k connections and x is the scaling exponent). Power-law distributions seem to be common in both information networks and biological systems.
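One standard construction that produces such a power-law distribution is preferential attachment, in which each new node links to an existing node with probability proportional to how connected that node already is. The sketch below is a generic illustration of that mechanism (the network size and sampling details are arbitrary), not a model of any particular biological network:

```python
import random
from collections import Counter

# Grow a network by preferential attachment: each new node attaches to an
# existing node chosen with probability proportional to its current degree.
# The resulting degree distribution is heavy-tailed, roughly a power law:
# a handful of hubs, and most nodes with only a few links.
random.seed(3)

degree = {0: 1, 1: 1}          # start with two connected nodes
endpoints = [0, 1]             # every edge contributes both endpoints;
                               # sampling this list is degree-proportional

for new_node in range(2, 5000):
    target = random.choice(endpoints)      # "rich get richer"
    degree[new_node] = 1
    degree[target] += 1
    endpoints.extend([new_node, target])

# Tally how many nodes have each degree k -- P(k) falls off roughly like 1/k^x.
dist = Counter(degree.values())
for k in [1, 2, 4, 8, 16, 32]:
    print(f"degree {k:>3}: {dist.get(k, 0):>5} nodes")
```

Each doubling of k cuts the node count by a roughly constant factor, which is exactly the “looks the same at every scale” property described above.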
Melanie Mitchell presents a striking example of a power-law distribution, and how it develops, in discussing the question of metabolic scaling. Put simply, metabolic scaling is the observed phenomenon that larger animals have slower metabolisms. Naturally, we would expect this result and intuitively recognize that as body mass increases, surface area increases, but not as fast. Specifically, surface area increases as the square of height and body mass as the cube, so a simple geometric relationship between the two would predict a power-law exponent of 2/3. Yet the expected exponent of 2/3 for metabolic scaling proves to be incorrect; the measured result is 3/4. Larger animals are able to sustain a higher metabolism than one would expect. Why is this the case? After considerable sleuthing, researchers realized that the fractal branching pattern of capillaries in the lungs produces a higher oxygen uptake than surface area alone would suggest. Fractal structures (a related branch of mathematical complexity) are able to maximize how much surface area can be packed into a three-dimensional volume.
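A quick back-of-the-envelope comparison of the two exponents, using nothing beyond the arithmetic in the paragraph above, shows how much extra metabolic capacity the measured 3/4 law implies for large animals compared with the geometric 2/3 prediction (metabolic rate is taken as proportional to mass raised to the exponent, with the constant set to 1):

```python
# Compare the geometric prediction (exponent 2/3) with the measured
# metabolic-scaling law (exponent 3/4) over a range of body masses.
for mass in [1, 10, 100, 1000, 10000]:          # body mass in arbitrary units
    surface_law = mass ** (2 / 3)               # what simple geometry predicts
    measured_law = mass ** (3 / 4)              # what is actually observed
    print(f"mass {mass:>6}: 2/3-law {surface_law:>8.1f}   "
          f"3/4-law {measured_law:>8.1f}   "
          f"ratio {measured_law / surface_law:>5.2f}")
```

At a mass of 10,000 units the measured law allows more than twice the metabolic rate that surface area alone would permit, which is the gap the fractal branching structure accounts for.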
It seems quite remarkable to find these odd pieces of mathematical complexity theory, including fractals, power-law distributions, scale-free and small-world networks, and self-organizing behaviors, throughout the different branches of science. These are beautiful examples of what Eugene Wigner in 1960 called “the unreasonable effectiveness of mathematics in the natural sciences”. Physical reality seems to conform to mathematical principles or, to express it differently, mathematics seems to constrain the forms that physical reality displays. In this sense, then, mathematics is a metaphysical necessity of the physical universe. (see: The “God-like” Puzzles of Mathematics)
Conclusion
While the mathematics of complexity has been helpful in identifying and describing the mathematical forms that emergent systems exhibit in many fields, it does not explain why the world demonstrates such a universal directionality towards diversity and increasing complexity. Just as physicists cannot explain why the universe follows the arrow of time towards higher entropy and decreasing order, neither have complexity theorists offered an explanation for why the universe displays an inexorable (and seemingly contradictory) path towards increasing diversity and complexity in highly ordered patterns.
We are not going to solve that problem any time soon. The sheer scale of these questions suggests that we will need to develop an entirely new conception of the fundamental nature of reality before we can explain or understand the phenomena we are observing.