Well-engineered systems reflect a number of accepted principles of good design. The parts of an engineered system have known functions, irrelevant parts are removed, redundancy is explicit, and designers attempt to maintain separation of concerns, i.e., a part participates in just one functional unit and is designed to do its one task well. Engineers do everything possible to prevent the emergence of unforeseen consequences (bugs). In contrast, parts in evolved systems may have mysterious interactions with several apparently separate functions, and they may not seem to be optimized for any of those roles.
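To make the contrast concrete, here is a minimal sketch, in Python with hypothetical names, of what those principles look like in code: each part has one known function, and the parts interact only through narrow, explicit interfaces.

    class TemperatureSensor:
        """Does one thing: report the current temperature."""
        def read_celsius(self) -> float:
            return 21.5  # stand-in for real hardware access

    class Thermostat:
        """Does one thing: decide whether heating should run."""
        def __init__(self, target_celsius: float):
            self.target_celsius = target_celsius

        def heating_needed(self, current_celsius: float) -> bool:
            return current_celsius < self.target_celsius

    # Each part can be tested, replaced, or removed independently.
    thermostat = Thermostat(target_celsius=20.0)
    print(thermostat.heating_needed(TemperatureSensor().read_celsius()))

In an evolved system, by contrast, the sensor might also quietly influence some distant subsystem, and no one would remember why.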
Software systems, especially large or old software systems, owe much more to evolution than we sometimes wish to acknowledge. Complex computing systems may start with good engineering, but all too soon the best intentions give way to expediency. Changes accumulate by accretion of small modifications, each one intended to fix a bug, add a small function, or change an existing one. Inevitably, unintended consequences accumulate as well. As they age, computing systems begin to resemble biological systems.
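What such accretion looks like in code can be suggested with a hypothetical sketch (the names, dates, and business rules are invented): a pricing function that began as a one-line rule and has been patched, one expedient special case at a time.

    from dataclasses import dataclass
    import datetime

    LEGACY_ACCOUNTS = {"acct-0042"}  # 2005: grandfathered contracts

    @dataclass
    class Item:
        base_price: float
        category: str

    @dataclass
    class Customer:
        id: str
        country: str

    def compute_price(item: Item, customer: Customer, today: datetime.date) -> float:
        price = item.base_price
        if item.category == "book":         # 1998: tax exemption for books
            price *= 0.95
        if customer.country == "CA":        # 2001: launch promo, never removed
            price *= 0.90
        if today.month == 12:               # 2003: holiday discount
            price *= 0.85
        if customer.id in LEGACY_ACCOUNTS:  # 2005: price cap for old contracts
            price = min(price, item.base_price * 0.80)
        # Which branches still matter? Removing any of them is now risky.
        return price

    print(compute_price(Item(10.0, "book"), Customer("acct-0042", "CA"),
                        datetime.date(2012, 12, 1)))

Each branch was a reasonable patch on its own; together they interact in ways no one fully understands anymore.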
Three and a half billion years of evolution has given rise to living systems that are incredibly elaborate. Complex biochemistry begets complex cells that underlie complex organisms [1].
Software professionals might characterize biological systems as a triumph of layer after layer of “clever hacks.” Each new layer exploits the hacks that came before. As a result, biological systems include vestiges of functional units that may be irrelevant to current circumstances, or maybe not. These vestiges are left over from the ancestral history of the organism. Some may still be functional in unusual circumstances (e.g., rarely used metabolic functions). Others, such as large segments of the human genome, might have no function at all. Yet the closer scientists examine supposedly non-functional "junk" DNA, the more of it turns out to have a function after all.
To any IT manager, especially one who was around during the late '90s, the above should sound very familiar. The history of computing may be relatively short but, as we learned from the Y2K (Year 2000) experience, it is long enough for legacy computing systems to be full of obscure code that may or may not be relevant to current circumstances.
All complex evolved systems, be they biological, social, economic, or computing systems, change over time as a result of the interaction between various sources of novelty and various mechanisms for weeding out “undesirable” or “unfit” novelty. In biological evolution, novelty is presumed to arise by various random processes, and weeding out occurs when an organism does not survive long enough to produce offspring. Computing systems also evolve. Novelty in computing usually arises from human creativity. Computers are general-purpose modeling tools, so there are always new ways computers can be used, new ways for them to interact, and new architectures for their design and construction. Weeding out happens when the novel uses simply don’t work, or don’t scale. But most often novelty in computing fails simply because the marketplace rejects it.
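The variation-plus-selection loop common to all such systems can be sketched as a toy model in a few lines of Python; this illustrates the abstract process only, not any particular biological or computing system. Random mutation supplies the novelty, and a fitness test does the weeding out.

    import random

    # Toy evolution: drive a bit string toward a target by random
    # mutation (novelty) and survival of the fittest (weeding out).
    TARGET = [1] * 20

    def fitness(genome):
        return sum(g == t for g, t in zip(genome, TARGET))

    def mutate(genome, rate=0.05):
        return [1 - g if random.random() < rate else g for g in genome]

    population = [[random.randint(0, 1) for _ in TARGET] for _ in range(50)]
    for generation in range(100):
        # Weeding out: only the fitter half survives to reproduce.
        population.sort(key=fitness, reverse=True)
        survivors = population[:25]
        # Novelty: offspring are imperfect copies of survivors.
        population = survivors + [mutate(random.choice(survivors)) for _ in range(25)]

    print(fitness(max(population, key=fitness)), "of", len(TARGET), "bits match")

In the computing marketplace the fitness test is far less mechanical, of course, but the overall loop of generate-and-discard is the same.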
Evolution of a given complex system, whether a biological or a computing system, occurs in the context of other evolving systems. That is, these systems co-evolve with other organisms or computing systems that compete and/or cooperate with one another. Biological co-evolution in predator/prey or symbiotic relationships tends to accelerate evolution, something that should sound familiar as we cope with today’s co-evolutionary arms race between computing virus/worm predators and their Windows prey. The interplay between email spammers and spam-filter developers is another obvious example of digital co-evolution.
One important lesson from biological co-evolution that is only now being absorbed by computing professionals is that monocultures, i.e., large populations of genetically identical organisms such as corn fields or rubber plantations, are big, stationary targets for diverse evolving predators. Once any virus, bacterium, fungus, or insect manages by mutation to escape the defenses of one plant in the monoculture, all plants are immediately at risk. The Irish potato famine of 1845-1850 is an unfortunate example of what can happen. To our dismay, we are discovering that this same principle applies equally well to computing monocultures such as the Windows-Office-IE-Outlook monoculture. Exploits that attack a weakness in such monocultures spread like wildfire.
[1] McShea, D. W., “The minor transitions in hierarchical evolution and the question of a directional bias,” Journal of Evolutionary Biology, Vol. 14, pp. 502-518, Blackwell Science, Ltd., 2001.
Contact: sburbeck at mindspring.com
Last revised 6/8/2012