Computers are increasingly specialized, both in hardware and in software. One might even say they are becoming differentiated in a manner analogous to biological cells. Routers are specialized for bandwidth. Database servers are specialized for I/O, caching, and query. Web servers are specialized for small transactions and throughput. High-power parallel processing engines such as IBM's 'Watson' are specialized for massively parallel Hadoop operations. Portable devices such as PDAs, cell phones, and MP3 players are specialized for very simple user interfaces, low power consumption, and long battery life. Game boxes are specialized for rapid graphics calculations. And the many embedded devices such as those in cars are further specialized for reliability, durability, precise real-time control, and the like.
Specialization in computing is possible because the various roles computers play in modern life have become more specialized. The role of a router is nothing like the role of an iPhone. Specialized roles allow manufacturers to produce software or hardware that is best suited to the specific roles. The computing industry continues to evolve to provide more options to support more specialized needs at different costs. For example, the specialized needs of a game box have driven the development of very fast graphics processors. Market forces have brought forth CPU chips that vary in cost from less than a dollar to several hundred dollars depending upon speed, function, power usage and the like. Different types of memory, disk, display, and communications are also available at different prices.
The growing multicellularity of computing also facilitates specialization. A specialized computer can rely on others, via network connections, for services it does not provide itself. No longer does one computer have to do everything. That's the point of Service-Oriented Architectures (SOA).
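To make that delegation concrete, here is a minimal sketch in Python of a specialized device asking a remote service to do work it does not do itself. The service URL and the JSON request/response shape are hypothetical assumptions chosen for illustration, not any particular product's API.

```python
# Sketch of service-oriented delegation: a small, specialized client
# does not implement search itself; it asks a remote service over the
# network. The host name and endpoint below are hypothetical.
import json
import urllib.request

SEARCH_SERVICE = "http://search.example.internal/api/query"  # hypothetical endpoint

def remote_search(term: str) -> list[str]:
    """Delegate a search request to a specialized remote service."""
    payload = json.dumps({"q": term}).encode("utf-8")
    request = urllib.request.Request(
        SEARCH_SERVICE,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    # The client stays simple; the heavy lifting happens elsewhere.
    with urllib.request.urlopen(request, timeout=5) as response:
        return json.load(response).get("results", [])

if __name__ == "__main__":
    for title in remote_search("multicellular computing"):
        print(title)
```

The point of the sketch is only that the client needs nothing beyond a network connection and a thin protocol; everything expensive lives in the specialized service it calls.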
Specialization in computing is necessary for three reasons. First, many of the specialized requirements are incompatible. A PDA or cell phone, for example, must run on minimum battery power, whereas a computation engine must expend power with abandon in order to maximize FLOPS. Second, excess generality, especially in software, imposes the added cost of a larger software engineering team. Software that supports a larger than necessary set of functions also lengthens time-to-market, is almost inevitably more buggy, and thus requires a larger customer support staff and risks customer dissatisfaction. Finally, the more function a system supports, the larger the "surface area" exposed to attack by viruses, worms, spyware, sniffers, and other malware. It is no accident that most malware enters our systems by first infecting a general-purpose end-user PC.
At first blush, there appeared until recently to be one glaring exception to increasing specialization in computing: Windows PCs attempted to support every possible function. Windows was the pinnacle of single-cell computing, not the basis of multicellular computing. The cost in increased complexity and generality nearly crippled Windows Vista, which was shunned by most corporate IT departments.
Yet Windows still boasts a rich ecology of third-party applications that rely on the Windows marketplace as well as the Windows API. A host of Windows programmers have, at great cost in time and effort, learned how to use the Win32 API set. While protecting their livelihood, these programmers helped to perpetuate the Windows software ecology and discourage new competitors. So the transition from Windows to more specialized personal computers has been slower than it otherwise might have been. It is happening nonetheless. More specialized devices such as iPads, iPhones, Android phones, and other smartphones continue to make Windows desktops and laptops less critical to business. Outside of business, especially among the young and hip, Windows boxes seem hopelessly antique. And the "App" ecologies for iPhones and Android smartphones have drawn the lion's share of new developers. That's where the money is now.
Microsoft's ambitions have not been the only factor opposing specialization. Biological systems are self-configuring, self-protecting, and self-optimizing in ways that computing systems are not. IBM's efforts to create "autonomic" computing systems were an attempt to redress this deficiency. But creating truly self-configuring systems is a difficult task. So we still must configure, provision, and maintain our systems by the mostly manual efforts of IT professionals (or gifted amateurs). The more specialization in computing, the more specialization will be required in hard-pressed IT staffs. Until computers are primarily self-configuring, corporate staffing inertia will continue to work against specialization.
Economic forces and user needs have also tended to counter specialization. A few years ago there was a flurry of interest in what were then called "thin clients." That meant, in effect, a replacement for the ever-present Wintel PC that would be specialized to support the needs, and only the needs, of the knowledge worker. Unfortunately, it turned out to be much harder than expected to decide what needs were common to such workers. More recently we saw the rise of the netbook, a very inexpensive PC-like laptop somewhat specialized for people who primarily wish to browse the web. That market was expanding until the iPad and iPhone (and their imitators) captivated the public. Now the most popular innovations in mass-market computing emerge first in the iPad/iPhone market rather than the Wintel ecology. Microsoft struggles to keep up.
With Microsoft's death grip on mass-market software broken, innovation is blossoming. One result is that specialization is accelerating in most, if not all, areas of computing, especially in small low-power personal computing devices and special-purpose sensor/effector devices. Worldwide shipments of new cell phones, PDAs, and smartphones already far outstrip those of Wintel boxes. A one-size-fits-all operating system simply cannot mutate fast enough to keep up.
Server computing is specializing too. Within the corporate IT world, the needs of large-data analytics are beginning to come to the fore. And huge Web server farms such as Google's rely on cheap, stripped-down boxes running specialized operating systems, often based on Linux. It is a new world in which one size does not fit all.