In complex, dynamic, adaptive systems, sets of "peer" elements interact with one another in ways that tend to form stable, mutually reinforcing relationships. Examples are commonplace in ecologies, societies, economies, and geographies and, of course, in biology. The mechanisms by which new levels emerge are difficult to discern in such large, complex, slowly evolving systems. Computing, however, is an exception. At least it was for its first fifty years, during which it evolved quickly, observably, and ostensibly "by design." Then software "viruses" emerged that could spread through floppy disks. Now they spread through email, Facebook, Twitter, and other social media via the Internet. They have become a part of, rather than separate from, economies, societies, and even international affairs. Hence the new levels of complexity emerging in the digital world of bits have merged with those of the more familiar world of atoms.
An element at one level may play roles in more than one tightly interconnected set. For example, individual humans participate in many social sets: families, companies, clubs, markets, tribes, nations, etc. Each social group is distinct from the others and relies on different kinds of human interaction. Yet the various social groups have linkages because the individuals they share act as bridges between them. Such links are not confined to human social systems. In general, if elements in a complex system participate in multiple higher-level sets, there will be subtle linkages between the sets.
Multi-level emergence is especially important to understanding both digital and biological systems. It poses severe, sometimes insurmountable, challenges for the predictability and manageability of the resulting systems. Each new level obeys its own "operating" principles that, at each subsequent higher level, operate at a larger scale, at a slower pace, and increasingly divorced from those of the base elements, whether those elements are atoms or bits. Understanding or predicting the behavior of higher-level systems in terms of their underlying atoms or bits quickly becomes impossible. Human behavior is simply not explainable in terms of atoms, nor is the Internet in terms of bits.
Cause and effect in multi-level complex systems can cross levels with quite unpredictable consequences. Whether systems were designed or have evolved, they tend to use encapsulation mechanisms to restrict unwanted interactions that would otherwise devolve into chaos. However, for fundamental reasons, encapsulation cannot tame all "unwanted" interactions. Most of the biomolecules inside a living cell are isolated from the external environment by the cell membrane and often further isolated within internal cellular organelles or by attachment to particular subcellular structures. Yet some biomolecules must be able to pass through the membranes for the cell to function. Once a molecule has crossed an encapsulation barrier, it interacts with a different set of molecules, with different consequences. Similarly, modern computing hardware and software create various barriers to prevent the inappropriate movement or execution of code or data. Firewalls, for example, seek to prevent inappropriate interactions between computers over Internet connections. Complete isolation is not possible because some information must be communicated across these barriers. Joel Spolsky's Law of Leaky Abstractions points out that, in the world of software, levels of abstraction are never foolproof.
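To make the software side of this concrete, consider the following minimal C sketch (the Account type and function names are hypothetical, invented for illustration). The API enforces an invariant across an encapsulation barrier, but any code holding the raw pointer can reach around the API, just as a molecule that crosses a membrane meets a new set of interactions:

    /* A minimal sketch of an encapsulation barrier in software.
       The Account type and its functions are hypothetical. */
    #include <stdio.h>
    #include <stdlib.h>

    typedef struct Account Account;      /* opaque type: clients see only a pointer */

    struct Account { long balance; };    /* normally hidden in a separate .c file */

    Account *account_new(void) {
        return calloc(1, sizeof(Account));
    }

    void account_deposit(Account *a, long amount) {
        if (amount > 0)                  /* the API enforces an invariant: */
            a->balance += amount;        /* only positive deposits are allowed */
    }

    long account_balance(const Account *a) {
        return a->balance;
    }

    int main(void) {
        Account *a = account_new();
        account_deposit(a, 100);         /* through the barrier: balance = 100 */
        account_deposit(a, -50);         /* rejected by the API */

        /* The "leak": nothing stops code that holds the raw pointer from
           bypassing the API and violating the invariant directly. */
        *(long *)a = -999;

        printf("balance = %ld\n", account_balance(a));   /* prints -999 */
        free(a);
        return 0;
    }

The barrier works only as long as every client honors it; once something crosses it by another route, the invariants no longer hold.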
Simply put, it is difficult, if not impossible, to predict the behavior of multi-level systems -- and most interesting systems are multi-level. The details of a hurricane or tornado are fundamentally not explainable by invoking the physics of individual air and water molecules[1]. Nor can a computer be understood by pondering the behavior of electrons or the interconnections between logic gates. Similarly, the behavior of even a single cell cannot yet be predicted by understanding its chemistry at the molecular level. We intuitively recognize these limits, and their consequences for predictability, in business, social, economic, and ecological spheres as well as in biology and computing.
Occasionally, our inability to predict and manage multi-level phenomena becomes explicit and has profound consequences. Pharmaceutical drug discovery is a biological area in which we face many levels of complexity between the biomolecular “cause” and the “effect” on the whole organism. We seek a small molecule, i.e., a drug, that modulates some intracellular chemical pathway in a way that desirably affects the human body (or even the human mind) yet does not affect any other cellular process in a deleterious manner. The many levels of complexity between cellular chemical reactions and whole-body effects and side-effects make it very difficult to find new drugs and even more difficult to determine that they are safe and effective. Similarly, in the computing realm, multi-level interactions account for some of the most recalcitrant bugs we face. Perhaps the most egregious example in computing is the prevalence of buffer-overrun exploits. A buffer-overrun takes place, essentially, at the machine language level; the necessary code for array-bounds checking translates to perhaps a dozen extra machine instructions. Yet the effects of buffer-overruns can be Internet-wide. The “SQL Slammer” denial-of-service worm that slowed or blocked 80% of all Internet traffic for a few hours on January 25, 2003, was due to a buffer-overrun in Microsoft SQL Server software. One of the largest viral botnets in the world, known as Bobax, "...bores open a back door and downloads files onto the infected machine, and lowers its security settings. It spreads via a buffer overflow vulnerability in Windows, and inserts the spam code into the IE browser so that each time the browser runs, the virus is activated." Sadly, much of the spam that bedevils the entire Internet is due to incompetently written code that allows buffer overflows in various popular systems.
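To ground the point, here is a minimal C sketch (hypothetical function names; not the actual Slammer or Bobax code) contrasting an unchecked copy into a fixed-size buffer with a bounds-checked version. The difference is a handful of machine instructions, yet the unchecked form is the machine-level flaw behind such Internet-wide exploits:

    /* A minimal sketch of the classic buffer-overrun flaw; the function
       names are hypothetical and this is not any real worm's code. */
    #include <stdio.h>
    #include <string.h>

    /* Unsafe: no bounds check. An input longer than the buffer overwrites
       adjacent stack memory, potentially including the return address. */
    void copy_unchecked(const char *input) {
        char buf[16];
        strcpy(buf, input);              /* overruns buf for long inputs */
        printf("%s\n", buf);
    }

    /* Safe: the bounds check compiles to only a few extra instructions. */
    void copy_checked(const char *input) {
        char buf[16];
        strncpy(buf, input, sizeof buf - 1);
        buf[sizeof buf - 1] = '\0';      /* strncpy does not always terminate */
        printf("%s\n", buf);
    }

    int main(void) {
        copy_checked("a string comfortably longer than sixteen bytes");
        /* Calling copy_unchecked with the same input would corrupt the
           stack -- exactly the opening that worms like Slammer exploit. */
        copy_unchecked("short and safe");
        return 0;
    }

The bug lives at the level of a single stack frame, but a crafted input that overwrites the return address lets an attacker inject code, and the consequences propagate upward through every level the infected machine participates in.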
There are cognitive limits to our ability to trace cause and effect through multiple levels. We can focus on the elements of one particular level and contemplate the “part/whole” relationships in two directions. For example, we can think about how the elements, say molecules or machine instructions, are made up of atoms or orchestrated gates and how the properties of those atoms or gates affect the properties of the molecules or instructions. With a switch of cognitive repertoire, we can also contemplate how the molecules interact with other molecules to form crystals or various larger biomolecules. Similarly, we can contemplate how machine instructions cooperate to carry out a function or subroutine. A skilled practitioner -- a chemist or a programmer -- may be able to think about interactions in both directions at once, that is, to reason simultaneously about three levels of abstraction. But it is quite difficult, and seldom even useful, to do so. Sustained reasoning about four levels of abstraction at one time is, arguably, beyond the ability of the human mind.
As we have seen repeatedly in the evolution of computing, interactions between levels can constrain the possible evolutionary paths of adjacent levels. Every level other than the “highest” (i.e., most recently evolved) level of abstraction is at least somewhat constrained to evolve more slowly because changes that invalidate an implicit “contract” with the higher level tend not to survive long. In biology, single-cell organisms and viruses can mutate rapidly because their behavior is free of higher-level constraints. Individual cells within multicellular organisms are not so free to explore new behavior. Similarly, unconnected digital devices, PDAs or cell phones for example, change at a dizzying pace, whereas PCs become more powerful and cheaper but must generally retain the ability to run almost all applications from the prior generation. In multicellular computing, we see most innovations happening at the highest level, where new kinds of collaboration emerge -- new programming practices such as AJAX, new mashups in Web 2.0, Google's AdSense ecology, etc. These innovations seldom require changes to the underlying ecology of computers, operating systems, or Internet/Web protocols. Those that do tend to fail.
So now, while multicellular computing is in its relatively early stages, is the time for rampant innovation. It is also the time when architectural decisions -- good or bad -- will have the most effect in steering the future evolution of multicellular computing. The architectural principles upon which new systems are based will tend to settle into accepted patterns, and systems that already support the winning principles will be more likely to prevail.
[1] Weather and climate are chaotic systems, hence inherently unpredictable beyond short-term extrapolations of current states.
Contact: sburbeck at mindspring.com
Last revised 6/8/2012