The evolution of computing is vastly more rapid and better
known than the
evolution of life.
Modern digital computing began early in WWII to facilitate tasks
such as code breaking that required previously impractical
volumes of tedious and error-prone arithmetic computations.
These first computers executed the necessary sequence of logic
and arithmetic operations as directed by hard-wired plugboards,
so the "programs" were hardware too. Digitally
programmable "stored
program" computers first appeared in 1949.
Programs-as-data stored in addressable memory allowed software
and hardware to evolve somewhat independently. Thus the art of
programming was born. The central abstraction of storing program
instructions as data not only changed hardware architectures, it
also introduced the notion that bits can encode something other
than numbers.
The circuits underlying hardware representations of bits
evolved from relays to vacuum tubes to transistors to integrated
circuits and in some cases to optical circuits. This
progression created ever smaller and denser circuitry that
increased both speed and storage capacity by orders of
magnitude. Today even smart phones contain multi-core
CPUs. Large organizations coordinate tens or even hundreds of
thousands of processors in massive server farms. Programs such
as those that implement Google's PageRank algorithm may run in
parallel on hundreds of thousands of CPUs. "Permanent"
storage evolved similarly. In 1980 a large mainframe might
have had a few gigabytes of spinning disk online. Now a teenager
may have ten times that in his jeans pocket. Human
interaction with computers advanced dramatically as well. It
began with plugboards and panel switches, then used teletypes
and video monitors, and has since progressed through mice and
touch screens (with "retinal" resolution) and now to voice
commands and gestures sensed by the computer's camera or
accelerometers.
Programming has evolved rapidly too. The "meaning" of the bits
in stored programs co-evolved with new hardware features such as
processor interrupts, pointers, register sets, floating-point
processors, segmented memory and eventually virtual memory. To
manage the increasing richness of hardware capabilities,
programmers created code abstractions easier to understand than
binary machine language: first assembly languages and then
compiled or interpreted languages. Compiled FORTRAN appeared
in 1957, followed by LISP and ALGOL in 1958 and COBOL in 1959. In
the early 1970s, C and object-oriented languages such as
Smalltalk emerged, followed by Scheme. Programming
languages continue to evolve today because of the new demands
created by the interpenetration of computers, society, and the
Internet.
A more dramatic and important evolution occurred in what the data bits
represented. Initially bits represented binary
numbers. By the 1960s it became common to also use groups
of bits to encode characters that could be entered and printed
via teletype machines. ASCII was standardized in 1963, IBM's
EBCDIC appeared soon after, and FORTRAN, COBOL, and LISP were
able to deal with character and string data types in addition to
numbers. Text
processing became practical. As Unix machines and PCs became
commonplace, character string data overtook numeric data as the
most common domain of computing. Now, character data is likewise
outweighed by image, audio and video data. And an
increasingly wide array of sensors and actuators for all sorts
of physical properties -- pressure, temperature, acceleration
(in our iPods and iPhones), chemical concentrations, radio and
GPS signals, etc. -- generate and consume data. Now mobile
devices can know where they are in the physical world and soon
will be able to recognize the people nearby by voice and face.
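To see the point in miniature: the same bits mean a number, text, or a sensor reading only by convention. The short Python sketch below (the byte values and the "sensor" interpretation are illustrative assumptions, not drawn from any particular system) decodes one four-byte sequence three different ways:

    # The same four bytes, read under three different conventions.
    raw = bytes([0x48, 0x69, 0x21, 0x00])

    # 1. As a 32-bit little-endian integer: just a number.
    as_number = int.from_bytes(raw, byteorder="little")                   # 2189640

    # 2. As ASCII text (standardized in 1963), with the trailing NUL stripped.
    as_text = raw.rstrip(b"\x00").decode("ascii")                         # 'Hi!'

    # 3. As two 16-bit "sensor readings", a hypothetical device convention.
    as_sensor = [int.from_bytes(raw[i:i + 2], "little") for i in (0, 2)]  # [26952, 33]

    print(as_number, as_text, as_sensor)

Nothing in the bytes themselves says which reading is correct; the convention lives entirely in the program that interprets them.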
Computers not only interact with people and other computers;
increasingly they interact directly with the physical world.
They control real-world processes such as traffic lights, car
engines, chemical plants and nuclear power plants. They
fly planes and drive cars. They control various robots (now
fairly primitive, but...). And computer-controlled
3-D printers turn digital models of objects
into actual objects.
The Web has spawned all sorts of emergent multicellular
computing constructs such as infectious viruses and
worms, search engines, multi-player Internet games, peer-to-peer
networks, wikis, blogs, social networking sites, folksonomies,
photo sharing, Skype, Web
Services, mashups, and Web 2.0.
We have also witnessed the growth of cybercrime and now
cyberwarfare, e.g., via Stuxnet and Flame.
In the stratosphere of Cyberspace we find "The Cloud." Of
course, the Cloud is merely a marketing metaphor. Nonetheless,
it is a metaphor for real Internet-connected services
specialized for attracting mobile devices and their cyborg human
partners. So the Cloud "serves" hundreds of millions of
cyborgs. And therein lies a lucrative space to be
exploited by the most adroit services. Each "Service"
tends to be nominally free, but it is "monetized" by selling the
information about the cyborgs, their friends and other relations
(real as well as virtual) gathered in the process of
providing the service. The very real and massive warehouses
of "Big Data" computers that provide such services also watch
over, analyze, and permanently remember virtually all interactions
in this "Cloud". The Cloud knows all, sees all, and remembers
all. Only angels are truly safe in The Cloud.
We've come a long way, baby!
What is amazing is that the evolution of the “virtual” world occurred so rapidly. Modern high-tech human societies are the result of more than 13 billion years of step by step evolution from the first atoms. Yet a mere 70 years after the first true digital computer, humans are beginning to be influenced as much by bits as by atoms. The evolution of life and the evolution of computing are merging, bringing the complexity of both realms together in completely unforeseeable ways.
Contact: sburbeck at mindspring.com
Last revised 7/21/2013