Most evolution is gradual. Most biological species live surrounded by other species making their livings in somewhat similar ways. Individuals belonging to one species in such environments survive competition with individuals of their own and other species by doing one or a few things a little better than their competitors. Each species has its own ecological space, or niche, defined by the aggregate details of the ways its members make their livings. Each individual's genetic heritage is tested against the environment, including other individuals of its own and other species. Those individuals whose heritage better prepares them to survive and reproduce in the competitive environment are more likely to successfully pass on their genetic heritage or "knowledge" of what has worked in practice to the next generation. Equally obviously, unsuitable genetic heritages are less likely to be passed on. Natural selection of this type over many generations may cause a species' average capabilities, and thus the ecological niche it occupies, to evolve through the gradual accumulation of new genes and changes in the frequencies and combinations of existing genes.
However, in some cases, small incremental changes over many generations may reach a point of instability or opportunity enabling a significant change in the way a species makes its living; e.g., when the small changes allow the species to cross some kind of geographic or ecological threshold into a new adaptive zone not available to its current competitors. Having crossed the threshold, such a pioneering species may begin to evolve rapidly to exploit the environment in ways not available to competitors that have not crossed the threshold. Once the threshold is crossed, the species' population may grow explosively and (at least on an evolutionary time scale) fragment into a large number of new species specialized in new ways to more effectively exploit the new kinds of niches21. The shift into a new adaptive regime is called a “grade shift”22.
We humans have reinvented our roles in the ecosphere a number of times since we became a species separate from the anthropoid apes, and have evolved in increasingly revolutionary directions away from the comparative placidity of our largely herbivorous relatives. The first reinventions in our ancestry were almost certainly evolutionary, probably involving genetic changes taking place over many hundreds or thousands of generations (i.e., tens or hundreds of thousands of years). However, as humanity evolved, an ever greater percentage of our tested and accumulated World 3 knowledge about the world has been passed on culturally rather than genetically. Consequently, our evolving ecological relationships to the rest of nature have changed from a slow sequence of incremental changes depending on genetically based natural selection to increasingly rapid, abrupt and radically revolutionary transformations in our cultural heritage, now taking place in less than a generation. Changes of incomprehensibly great magnitude in the relationship of the human species to the rest of the physical and biological world now occur in less than a decade, compared to the tens or hundreds of thousands of years required for genetic evolutionary change.
Karl Popper (1972, pp. 238-9) recognized the fundamental difference between animal evolution and human evolution that I will elaborate here:
Animal evolution proceeds largely, though not exclusively, by the modification of organs (or behaviour) or the emergence of new organs (or behaviour). Human evolution proceeds, largely, by developing new organs outside our bodies or persons: 'exosomatically', as biologists call it, or 'extra-personally'. These new organs are tools, or weapons, or machines, or houses.
The rudimentary beginnings of this exosomatic development can of course be found among animals. The making of lairs, or dens, or nests, is an early achievement. I may also remind you that beavers build very ingenious dams. But man, instead of growing better eyes and ears, grows spectacles, microscopes, telescopes, telephones and hearing aids....
Yet the kind of extra-personal or exosomatic evolution which interests me here is this: instead of growing better memories and brains, we grow paper, pens, pencils, typewriters, dictaphones, the printing press, and libraries.
These add to our language--and especially to its descriptive and argumentative functions--what may be described as new dimensions. The latest development (used mainly in support of our argumentative abilities) is the growth of computers.
I will consider nine revolutions that have enabled grade shifts reinventing the nature of humanity. Four involve the invention or adoption of new kinds of technology, i.e., the development of tools external to the human body itself. Five relate to less tangible cognitive revolutions in the way humans gain and use non–genetic knowledge. The two types of change are often synergistic. Several published time-lines situate the technological revolutions in history (Barger 1996; Nunberg & Brownstein ????; Lee et al. 1996). The impact of these changes on the nature of humanity and organizations23 will be a major thread in weaving the theme of my Subject.
Humanity's first technological revolutions took place at an evolutionary pace long ago in our fossil past as a distinct species and accounted for the first major grade shifts. We don't know the details of where or when, but we know they occurred, because they are key features in differentiating our ancestors from their primate relatives, and because they opened totally new ways of making a living in the natural world not exploited by any other primate.
Use of clubs, stones (throwing and cutting), levers and fire to extend human reach and metabolism beyond anatomical and physiological limits
Most non–human animals' interactions with their surroundings are limited entirely to the capabilities of their physical anatomy: teeth and claws for defense and feeding, limbs for movement, hair and integument for insulation and protection from the elements, etc. A few bird species, such as some ravens and Galapagos finches, use sticks they pick up (and shape and carry to places where they are used) to tease tasty insects out of holes. Our close relatives, the anthropoid apes, have gone beyond this, to use and make a variety of simple tools (Whitten et al. 1999; Pennisi 1999). However, more than 2.3 million years ago our hominid ancestors began to make even more sophisticated tools (Roche et al. 1999; Steen 2000). The evidence suggests that by this time proto–humans were consistently making complexly shaped tools. There is some evidence that these tools enabled the hominids to successfully exploit food sources considerably beyond the anatomical and physiological capacity of their very close relatives. The proto–humans' dietary changes may have physiologically facilitated the development of larger brains (Wrangham et al. 1999; Wood & Brooks 1999; Sponheimer & Lee-Thorp 1999), and the requirements to manage and learn to use the new tool kits may have fed back into the evolution of more brainpower and language.
Ropes and digging implements used to control and manage non–human organic metabolism for human benefit (agriculture and transport) 10,000 years ago
Archeological evidence suggests that the next major reinvention of humanity took place towards the end of the last Ice Age, probably more than 10,000 years ago, with the development of agriculture and a sedentary life enabled by ever more sophisticated tools (Lewin 1988; Pringle 1998, 1998a). The first tools were probably of wood, stone, bone, sinew and skin, but the demands of the new agricultural existence eventually led to the development of more effective metal tools some 5,000–6,000 years ago24. Metallurgy enabled the increasingly sophisticated Bronze and Iron Ages, but did not seem to fuel another grade shift in cognition; it was not revolutionary in the sense that the development of agriculture was. Metallurgy merely enabled better agriculture and the building, maintenance and defense of larger villages.
In the Agricultural Revolution, tools helped to subdue and control nature. However, except for smelting involved in making metal tools and cooking, human ecology was still based almost entirely on controlling the flow of external energy via plant and animal metabolism. The Industrial Revolution fundamentally changed that relationship and enabled a grade shift of even greater magnitude than that of the agricultural revolution. The Agricultural Revolution probably took place over a time scale of centuries or even millennia. The pace of the Industrial Revolution was measured in decades.
Key inventions that started the Industrial avalanche were Thomas Newcomen's steam engine25, invented in 1712 for pumping out coal mines, and James Watt's substantially improved engine of around 1765 (Thurston 1878; Carnegie 1905); both used heat released by combustion to raise steam. In 1771 Richard Arkwright opened the first spinning mill driven by inorganic power (water).26 Steam engines soon powered all kinds of factory operations, including printing presses, and eventually transport. By removing the trophic constraints imposed by the limited availability of living resources of metabolic energy, the exploitation of the new non–metabolic energy resources fuelled an unprecedented period of exponential population growth which may only now be slowing down.
Change enabled by the grade shift of the Industrial Revolution continues today, with the development of ever more efficient and powerful energy sources to fuel the expansion of human control of global resources: internal combustion engines, turbines, nuclear power, etc. In 200 years (8 – 10 generations) humanity's populations and industrial activities have grown to the point where we now threaten the very survival of a substantial fraction of organic life on the planet.
The revolutions considered above enabled grade shifts by extending human anatomy and metabolism. The development of computers in the Microelectronics Revolution provides tools to extend mental processes beyond the limits of human neurons.
McKinney (2000) provides a detailed timeline28 for the evolutionary development of computing technology. In this, I regard the following events as crucial.
Mechanical data processing began when the Librarian of the US Surgeon General's Office, John Shaw Billings, suggested to Herman Hollerith that the processing of the 1890 US Census could be automated using punch cards29. AT&T developed the first electromechanical relay-based digital calculator in 1937 – two years before I was born. ENIAC, the first electronic tube–based computing system, started development in 1943, and was given its final conceptual breakthrough – the stored program – by John von Neumann in 1948 (Weik 1961). Shockley, Bardeen and Brattain invented the transistor in 1947. Transistors were first applied to digital computers in 1956. The first integrated circuits were developed in 1958–59 (Texas Instruments Inc. 2000), with the first computer based on integrated circuits built around 1961 (Texas Instruments Inc. 2000a). As a student and budding scientist, I had the opportunity to use all the generations of this technology (see Episode 2).
Once computers began to be built with integrated circuits, the technology began to evolve and shrink transistors at a completely unprecedented rate: the number of transistors per chip began to double every year or two. The phenomenon has been well documented as Moore's Law, named for the person who first predicted it in 1965 (Moore 1997). In 1959 a single good transistor could be purchased for around $6.00. In the year 2000, one could buy a 64 megabit SDRAM encompassing more than 64 million transistors, in a package of similar size to the 1959 transistor, for around $6.00. This is an increase in transistor packing density of more than 7 orders of magnitude in about 40 years (more, considering that one bit of memory involves several transistor elements). Computer "power" has probably increased even faster than this basic measure, since power is determined by a number of other factors in combination with the cost or packing density of transistors: clock speed, word length, speed of peripheral storage, sophistication of architecture and software and so on. Whatever the increase in computing power has been, it is incomprehensibly large in terms of most technological or social changes.
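As a rough consistency check on the figures just quoted (my own arithmetic, not Moore's), the sketch below compares the 1959 and 2000 transistors-per-dollar figures with the "doubling every year or two" rule:

```python
import math

# Figures quoted in the text: one transistor for ~$6.00 in 1959;
# a 64-megabit SDRAM (>64 million transistors) for ~$6.00 in 2000.
transistors_1959 = 1
transistors_2000 = 64e6
years = 2000 - 1959

ratio = transistors_2000 / transistors_1959
doublings = math.log2(ratio)            # how many doublings occurred
years_per_doubling = years / doublings  # implied doubling period
orders_of_magnitude = math.log10(ratio)

print(f"{doublings:.1f} doublings, one every {years_per_doubling:.2f} years")
print(f"{orders_of_magnitude:.1f} orders of magnitude")
```

With these inputs the check gives roughly 26 doublings, one every ~1.6 years, and about 7.8 orders of magnitude, consistent with both the "every year or two" doubling rule and the "more than 7 orders of magnitude" claim above.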
The revolutions just reviewed primarily concern apparatus and technologies that exist in the physical reality of Popper's World 1. As will be discussed in the next section, there have also been a number of revolutions in cognition in the evolution of humanity that have enabled profound "grade shifts" in what we are as humans. As the remainder of this work will demonstrate, these revolutions have been primarily responsible for creating Popper's World 3.
Cognitive Revolutions include a sequence of basically intangible revolutions in the way human minds acquire, create, store and retrieve information and knowledge32.
The first "revolution" was a purely genetic evolutionary process by which primitive organisms developed nervous systems able to hold memories of their surroundings and provide a behavioral capacity to learn from experience. How and when this level of organization was achieved is unknown, and not really pertinent to this discussion. However, the fact that our distant ancestors had cognitive faculties comprising a memory and the ability to add knowledge to that memory by learning was fundamental to all of the other revolutions discussed here. Robertson (1998) guesses that an individual ancestral human at this "level 0 civilisation" could encompass 10^7 bits of information in its brain.
A major enabler for the grade shift of the Agricultural Revolution was that our ancestors acquired a language capability. This allowed one individual to efficiently transfer complex knowledge to his/her offspring or other individuals (Kreger 2000). Because language does not fossilize, there is no way to determine the exact circumstances under which the capacity for language evolved. However, anatomical specializations to support the human larynx, believed to be essential for complex language, only evolved some 150,000 years ago. The first good archeological evidence for the kinds of activities that would seem to require speech to organize only appears in the record some 40–50,000 years ago (Holden 1998). With the acquisition of non–genetic capabilities for the transmission down the generations of a complex heritage of experience, the rate of evolution in humanity's ecological adaptations shifted into a much higher gear.
Robertson (1998) calculated that the capability to transfer knowledge by language would increase the amount of information a single human could hold in memory by at least two orders of magnitude compared to what could be remembered without language – or to around 10^9 bits. The memorization of sagas and learnings, and their passage from one generation to the next, represents the first tentative origins of Popper's World 3.
The Agricultural Revolution carried with it a need to develop and maintain a complex social hierarchy with systems for remembering and tracking the ownership and exchange of goods and services. The earliest archeological records of counting systems are from sites in Mesopotamia, dated some 11,000 years ago33. These are various clay tokens or tallies that looked like the objects they served to count. By 6,000 years ago, a system was devised to gather the counters together into a clay envelope, where the owner of the items represented by the counters was apparently identified by embossing the envelope with a roll seal. The next step was to emboss the envelope with the shapes of the different kinds of tallies and enumerate them with counting systems. Soon the tallies were dispensed with, and the accounts and contracts were kept simply by marking the clay tablets (Goetzmann 1996). By 5,000 years ago, pictograms expressing more complex ideas were introduced (Joshi 1998), and cuneiform writing developed, followed by alphabetic scripts – all pressed into clay or chipped onto stone (Mouck 2000).
Robertson (1998) calculates that writing again increased the amount of information a single person could manage by two orders of magnitude, i.e., to 10^11 bits in the era before the invention of the printing press. World 3 begins to be populated with persistent tangible artifacts as people begin to record their memories in writing.
The invention of papyrus, parchment, paper and ink certainly reduced the labor of recording words, compared to chipping them into stone, and undoubtedly helped to increase literacy. However, until comparatively recently, the duplication of writing remained an expensive manual process, and literacy was more–or–less restricted to educated royalty, scribes and clerics belonging to conservative priestly classes. Most ordinary people still depended on oral traditions for their cultural heritage, which undoubtedly limited the volume and complexity of the knowledge that could successfully be transmitted. This, in turn, limited the complexity of the technologies they could develop and maintain.
The invention in Europe of moveable type and the printing press in the decade around 1450 turned the production and replication of text documents into one of the first industrial mass production processes34. The first major customers for output from the presses were priests – for the production of holy texts (e.g., Johannes Gutenberg's Bible) and mass–produced indulgences that were very profitably sold to sinners. However, printing presses became more efficient over the next 200–300 years; as book prices declined, more and more commoners became literate, and individuals who wanted books could afford to buy more of them35. Literate commoners began writing books themselves to record and pass on knowledge they had accumulated from practical experience with the real world (as opposed to the often metaphysical speculations of the priestly class). The increasing volume and utility of knowledge held in widely circulated books and libraries undoubtedly played a crucial role in fuelling the Protestant Reformation (Knox 1999), the Scientific Revolution and the Industrial Revolution (Eisenstein 1979, 1983).
Given the means to record, reproduce and disseminate accumulated practical knowledge, people began to gather collections of published information relating to particular subjects. The concept of scholarly and scientific investigation and reporting known as the "Scientific Revolution" followed from an increasing interest in observing and reporting on natural phenomena and a growing awareness of the connections between scientific explanations and reality (Fjällbrant 1996-1997; Dewar 1998; Schement and Stephenson 1998).
Robertson (1998) describes the origin of the Scientific Revolution as follows:
Observation, experimentation, and the discovery of patterns are probably as old as mankind itself. Yet the use of these techniques increased enormously in the sixteenth and seventeenth centuries, so much so that a 'scientific revolution' is widely recognized to have occurred at that time. In other words, following the invention of printing we find, first, a massive increase in the production of information, and then an equally massive development and application of old techniques for refining that information. The scientific revolution of the sixteenth and seventeenth centuries can thus be seen as a necessary and even forced response to the large increase in the production [and distribution] of information that followed the invention of printing. The very quantity of newly produced information forced the development of techniques for dealing with large quantities of information. Those techniques are the processes that lie at the heart of modern scientific methods.
This is, of course, a description of how World 3 knowledge began to grow. As will be discussed in more detail in Episode 1, the first scientific journals began to be published in the 1660s. The rise of scientific publishing and of information systems to help retrieve knowledge from the scientific, scholarly and technical literature will be discussed later in this work.
As printing became increasingly industrialized, the relative cost of books continued to decline until the mid–20th Century, when the mass production process was fully industrialized and literacy became almost a universal human birthright. Robertson (1998) calculates that, with humans able to access libraries of recorded information, the information accessible to a single human brain increased by more than 6 orders of magnitude (to 10^17 bits) compared to the pre-printing era of writing. Also, with universal literacy, an ever–increasing proportion of the human heritage has been transmitted down and across the generations entirely extrinsically to the human genome and the lifetime of individual memories.
In the evolution of humanity's relationship to our environment, the development of science and technology and of the systems to record and retrieve this kind of knowledge represents a huge grade shift in the relative importance of genetic heredity versus cultural heritage in defining the niche humans, as a biological species, occupy in the world.
Printing provided the means to replicate, store and distribute practical knowledge at a relatively low cost. Computerized word processing, networks and the Internet, enabled by the microelectronics revolution, represent vastly more powerful tools for capturing, replicating and distributing ever larger volumes of knowledge, extraordinarily faster. Here, Robertson (1998) estimates that "computers" now allow individual humans to create and control 10^25 bits of information – 8 orders of magnitude more information than even cheap printing gave people access to (and this is still growing exponentially).
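Robertson's (1998) estimates, scattered through the preceding paragraphs, can be gathered into a single ladder. The sketch below (Robertson's figures, my arithmetic) simply computes the gain, in orders of magnitude, contributed by each cognitive revolution:

```python
import math

# Robertson's (1998) estimates of the information accessible to one
# person (in bits), as quoted in the text above.
levels = [
    ("pre-language brain", 1e7),
    ("spoken language",    1e9),
    ("writing",            1e11),
    ("printing/libraries", 1e17),
    ("computers",          1e25),
]

# Orders of magnitude gained at each transition.
for (prev_name, prev_bits), (name, bits) in zip(levels, levels[1:]):
    gain = math.log10(bits / prev_bits)
    print(f"{prev_name} -> {name}: +{gain:.0f} orders of magnitude")
```

The successive gains of +2, +2, +6 and +8 orders of magnitude make the accelerating pattern of the argument explicit: each revolution has multiplied accessible information by more than the one before.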
Because this technology to automate knowledge has changed so radically within single individuals’ lifetimes, many people find it difficult to come to terms with the shift in this last revolution from tangible paper to the intangible aspects of electronic information storage, management, display and distribution. The exponentially growing rapidity of change seems revolutionary to many. However, as long as World 3 knowledge is still being captured and distributed in the form of “documents”, I would argue that these changes are not conceptually revolutionary in the sense that they will spark a grade shift qualitatively different from that of the continuing Industrial Revolution. From my point of view, true knowledge automation dates from the establishment of the Standard Generalised Markup Language (SGML) in 1986 (Goldfarb 1996). I will elaborate these themes below in the Episodes following the Counter Subject.
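SGML itself is far too large to sketch here, but the conceptual point – that explicit markup turns a document from a stream of words into machine-addressable knowledge – can be illustrated with a few lines of Python operating on a small marked-up record. The fragment and its element names are my own hypothetical illustration (using SGML's XML descendant), not any real DTD:

```python
import xml.etree.ElementTree as ET

# A hypothetical marked-up document fragment. Because its structure is
# explicit, a program (not just a human reader) can retrieve specific
# pieces of knowledge from it.
doc = """
<article>
  <title>On the Origin of Grade Shifts</title>
  <author>A. N. Author</author>
  <claim topic="moores-law">Transistor counts double every year or two.</claim>
</article>
"""

root = ET.fromstring(doc)
print(root.findtext("title"))          # prints: On the Origin of Grade Shifts
print(root.find("claim").get("topic")) # prints: moores-law
```

A plain-text or word-processed copy of the same sentences would carry identical content for a human reader, but no program could reliably answer "what is this article's title?" – which is why markup, rather than faster printing, marks the conceptual break.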
The Microelectronics Revolution in hardware has combined with knowledge automation to become the Internet Revolution (Zakon 2002). The result has given people access to more knowledge than individuals can comprehend using their native cognitive abilities.
The real revolution, which is changing the nature of humanity more fundamentally than any of the previous revolutions and grade shifts, is the development of technology able to manage knowledge extrinsically to the human mind36. However, before examining this latest revolution in detail, it will be helpful to consider in the Counter Subject the cognitive dimensions of data, information and knowledge in human affairs.