A fundamental similarity between the evolution of living systems and computing systems is the way they become ever more complex. We know what that means at an intuitive level, but just what does it mean in a more formal sense?
There have been serious attempts by heavyweights such as Andrei
Kolmogorov, Gregory Chaitin, Charles Bennett, and Stephen Wolfram to
define complexity in a general, rigorous, and formal way. None of those
attempts has
been completely successful. We tend to think of complexity as more
intricacy than we can
comprehend. That sort of complexity is often called
detail, structural, or static complexity. Depending as it does
on the limits of our understanding, the definition of static complexity
is inextricably confounded with the definition of human cognitive
abilities, such as intelligence. But intelligence itself is
notoriously slippery and difficult to define. Another sort of
complexity, typically called dynamic complexity, is inherent in the
systems themselves, not in the limits of human comprehension of their
parts. This second sort of complexity
emerges naturally in systems that evolve over time via positive
feedback, e.g., meteorological,
cosmological, biological, ecological, geological, social, economic, or
computing systems.
Given that we can't satisfactorily define
complexity, it should come as no surprise that we cannot satisfactorily
measure
complexity in a general way either. So, what could it mean to say that
evolving complex dynamic systems have a habit of becoming more and more
complex? Without stepping into the deep waters of trying to define
complexity,
let me say that such systems become less and less predictable without
becoming random. [Note: we also lack satisfactory definitions and
measures of
randomness, so I use that term somewhat loosely too.]
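To make "less predictable without becoming random" concrete, here is a minimal sketch using the logistic map, a standard textbook example of deterministic chaos (my illustration; the text above names no particular system). The rule is one line of arithmetic with no randomness in it, yet two trajectories that start a billionth apart soon bear no resemblance to each other, so long-range prediction fails:

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n): a deterministic rule that
# is unpredictable in practice at r = 4 (chaotic regime) yet not random.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0; return the list of values."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two starting points that differ by one part in a billion...
a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-9)

# ...diverge completely within a few dozen steps.
for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: {a[n]:.6f} vs {b[n]:.6f}  diff {abs(a[n] - b[n]):.6f}")
```

The output is not noise, though: every value is pinned down exactly by the starting point, which is precisely the combination of unpredictability without randomness described above.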
Both static and dynamic complexity bedevil us in our computing
systems. We struggle with the static
complexity of program source code and database schemas that define
structural relationships between networks of interacting elements. These structural descriptions
become so intricate that they exceed our cognitive ability to
understand
them. Dynamic
complexity is qualitatively different; it is about what
happens at runtime as the behavior of a system, e.g., a computer
program, unfolds via interactions between elements. That
is, dynamic complexity arises from feedback loops between interacting
elements. Consider a flock of starlings, for example. Each bird
attempts to stay with the flock while avoiding collisions with other birds.
The turns and twists of each bird to satisfy these
goals affect the paths of many nearby birds that, in turn, affect the
paths of still more birds. Thus the
dynamics of the flock as a whole are complex and inherently
unpredictable. Yet anyone who has watched flocks of starlings can
see clearly that the behavior of the flock is not random. Note that
this sort of
complexity has nothing to do with human cognitive limits. It arises
from the distributed nature of the task the birds face
and the different perspective each bird has.
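The two rules just described (stay with the flock, avoid collisions) are essentially Craig Reynolds's "boids" model, and a few lines of code make the feedback loop explicit. This is a minimal sketch, not code from the text above; the radii and weights are assumed values chosen only for illustration:

```python
import random

# Boids-style sketch of the starling example: each bird follows two
# purely local rules -- steer away from birds that are too close
# (collision avoidance) and drift toward nearby flockmates (cohesion).
# The flock's path emerges from feedback between neighbors; no bird
# computes, or could predict, the trajectory of the whole flock.

NEIGHBOR_RADIUS = 10.0   # how far a bird "sees" flockmates (assumed)
TOO_CLOSE = 2.0          # collision-avoidance distance (assumed)

def step(birds):
    """Advance each bird, given as (x, y, vx, vy), by one time step."""
    moved = []
    for x, y, vx, vy in birds:
        ax = ay = 0.0
        for ox, oy, _, _ in birds:
            dx, dy = ox - x, oy - y
            d = (dx * dx + dy * dy) ** 0.5
            if d == 0.0:                 # skip self (and exact overlaps)
                continue
            if d < TOO_CLOSE:            # avoidance: veer away, strongly
                ax -= dx / d
                ay -= dy / d
            elif d < NEIGHBOR_RADIUS:    # cohesion: drift toward neighbors
                ax += 0.05 * dx / d
                ay += 0.05 * dy / d
        vx, vy = vx + ax, vy + ay
        speed = (vx * vx + vy * vy) ** 0.5 or 1.0
        vx, vy = vx / speed, vy / speed  # constant speed; only heading changes
        moved.append((x + vx, y + vy, vx, vy))
    return moved

# Scatter 50 birds with a shared heading, then let the local rules run.
flock = [(random.uniform(0, 20), random.uniform(0, 20), 1.0, 0.0)
         for _ in range(50)]
for _ in range(100):
    flock = step(flock)
```

Run twice from the same initial positions, the simulation retraces itself exactly (the only randomness is in the starting scatter); perturb one bird slightly and the flock soon takes a visibly different path, which is the feedback-driven unpredictability the starlings exhibit.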
Emergent systems such as flocks of birds, economic trading systems, groups of human dwellings, or communities of websites and blogs become new "entities": the flock, the stock exchange, the city, or a community in the blogosphere. These new entities then begin to interact with each other. Cities, for example, compete with each other for status, pride, educated workforces, new businesses, and even land. Eventually their interactions grow rich enough that yet another, higher-level complex system can emerge: cooperating cities form countries, and so forth. These new sorts of interactions add yet another level of complexity (unpredictability) without the system becoming more random.
As we will see, the consequences of emergent multi-level complexity turn out to be central to understanding both computing and biological systems.