Brian Arthur’s thesis on the evolution of technology in his book The Nature of Technology (with thanks to Rick Munoz for the gift) dovetails so nicely with my broader paradigm of human social institutional ecology, addressing precisely that aspect which I had mostly left to the side (see, e.g., The Politics of Consciousness, in which I identify “social institutional and technological regimes” as the paradigms into which evolving memes aggregate, but focus on social institutions and ideologies), that this post is largely a synopsis of Arthur’s ideas, extended into and blended with “my own” marginal contribution. (The book is well worth reading; my summary here does not do it justice).
In brief, Arthur’s thesis is that technologies, which are essentially “programmed” natural phenomena, are composed of assemblies and components, and subassemblies and subcomponents, down to an elemental level, with constant marginal modifications and recombinations of subcomponents creating technological domains (e.g., digital, electronic, genetic, etc.), thus evolving within the context of these technological ecosystems (an idea I began to address before reading Arthur’s book, in The Evolutionary Ecology of Audio-Visual Entertainment (& the nested & overlapping subsystems of Gaia), Information and Energy: Past, Present, and Future, and The Nature-Mind-Machine Matrix). The entire corpus of technology, in articulation with the evolving economy and legal system, evolves as well, causing cascades of destruction of linkages to technologies made obsolete by innovations, and cascades of new technologies made possible or necessary by other recent innovations.
The key to Arthur’s paradigm is that technologies are purposive programmings of natural phenomena (including human behavioral phenomena), and so both include (along with what is more conventionally visualized as “technology”) those social institutional innovations that are purposive (e.g., currency instruments) and exclude anything that developed haphazardly (e.g., informal social norms), though they both coevolve, adapting to one another. Technological evolution differs from Darwinian biological evolution primarily in the fact that new “species” (i.e., inventions) do not emerge merely as the result of an accretion of incremental changes selected by virtue of their relative reproductive success, but also by virtue of rather sudden new configurations of old technologies, and applications of new principles to old challenges. But these novel forms, whether the small increments of engineers making new applications of old technologies to solve novel problems, or the larger innovations of inventors utilizing new principles to address new or old challenges, are then subjected to that same Darwinian lathe.
Some of the distinguishing characteristics of technologies are that they are recursive (composed of components that are themselves technologies, which in turn are composed of components that are in turn technologies), modular (composed of main assemblies performing main functions and subassemblies performing auxiliary functions), programmings of natural phenomena, and constantly evolving from earlier forms, midwifed by human ingenuity but generated, in a sense, by earlier innovations. Each problem confronted implicates both backward and forward linkages, affecting the components of the technology worked with and the possibilities with which new problems can be addressed.
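Arthur's recursiveness lends itself to a simple structural sketch. The following toy model (my own illustration, not Arthur's formalism; the names and "phenomena" are hypothetical) represents a technology as a component that programs some natural phenomenon and may itself be built from subcomponents that are technologies in turn, bottoming out at elemental exploitations of phenomena:

```python
from dataclasses import dataclass, field

@dataclass
class Technology:
    """A technology in Arthur's sense: a purposed programming of a natural
    phenomenon, possibly assembled from subcomponents that are themselves
    technologies (the recursion he describes)."""
    name: str
    phenomenon: str                                   # the phenomenon it programs
    components: list["Technology"] = field(default_factory=list)

    def depth(self) -> int:
        """How many levels of assembly sit above the elemental level.
        Recursion bottoms out at technologies with no subcomponents."""
        if not self.components:
            return 1
        return 1 + max(c.depth() for c in self.components)

# Illustrative hierarchy (hypothetical labels):
blade = Technology("blade", "hard edges cut softer materials")
handle = Technology("handle", "leverage and grip")
knife = Technology("knife", "directed cutting", [blade, handle])
print(knife.depth())  # → 2
```

The point of the sketch is only that "assembly of assemblies" is a tree: any analysis of a technology can descend level by level until it reaches components that are direct exploitations of phenomena.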
Technologies form a kind of language within their domain, which practitioners draw on the way a composer or author draws on the musical or written language that is their medium, expressing a desired objective through recourse to the known phrases and grammars of those languages. It develops according to a combinatorial evolution, with something that developed in another domain for another reason available to those who recognize a novel use for it elsewhere. The memes of technological evolution are free radicals, able to attach to any other group of memes where they may have a particular basis for thriving.
Technology evolves in tandem with science, both as the means of scientific discovery (the instruments used) and as informed by science (which supplies the principles on which to base technological advances).
Technology evolves from few to many, from simple to complex, beginning with direct exploitations of natural phenomena (fire, sharp objects, etc.), and building on the possibilities created by their exploitation, with new technologies and technological domains opening up new opportunities for yet more innovations. This is not unlike the evolution of biological and social institutional forms, which evolved from a single cell into the plethora of life now on Earth, and from more or less homogeneous primate cultures to the great variation of human cultures generated by geographic dispersion and differentiation.
Nor is the winnowing out process particularly different, in which some technologies (species, cultures) become dominant and widespread, eclipsing others, sometimes even eliminating them altogether, forming distinct branches where an undifferentiated continuum would otherwise have been.
The processes of innovation rippling through the system (by posing new problems and creating new opportunities, by requiring new auxiliary assemblies, by rendering old ones obsolete, and the linkages that depended wholly on them obsolete as well), sweeping up economic and legal structures with it (creating new needs for new infrastructure, new forms of organization, new legal contexts, etc., while rendering others obsolete and archaic), includes a variety of stages, such as “standard engineering” (adapting an existing technology to varying contexts), adding on (improving performance and addressing problems by tacking on new subsystems), reaching limits and being faced with needs (trying to capture new potentialities that would require some improvement that current technologies can’t yet provide, and seeking a new principle to exploit to provide it), and undergoing a paradigm shift as a result (creating a new technology, that then sets in motion all of the rippling changes new technologies set into motion).
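The cascades of obsolescence described above — an innovation rendering a technology obsolete, and with it every linkage that wholly depended on it — can be sketched as a simple graph traversal. This is my own illustrative model under a strong simplifying assumption (each listed dependency is a *whole* dependency, so losing any one of them dooms the dependent technology); the example names are hypothetical:

```python
from collections import deque

def obsolescence_cascade(requires, seed):
    """Given a map from each technology to the set of technologies it wholly
    depends on, return everything that collapses when `seed` becomes obsolete.
    An illustrative model of Arthur's rippling cascades, not his formalism."""
    # Invert the map: for each technology, who depends on it?
    dependents = {}
    for tech, deps in requires.items():
        for dep in deps:
            dependents.setdefault(dep, set()).add(tech)
    # Breadth-first sweep: each loss may trigger further losses downstream.
    obsolete, frontier = {seed}, deque([seed])
    while frontier:
        gone = frontier.popleft()
        for tech in dependents.get(gone, ()):
            if tech not in obsolete:
                obsolete.add(tech)
                frontier.append(tech)
    return obsolete

# Toy example: vacuum tubes go obsolete; everything wholly built on them follows,
# while technologies on the transistor branch are untouched.
requires = {
    "tube amplifier": {"vacuum tube"},
    "tube radio": {"tube amplifier"},
    "transistor radio": {"transistor"},
}
print(sorted(obsolescence_cascade(requires, "vacuum tube")))
# → ['tube amplifier', 'tube radio', 'vacuum tube']
```

Real cascades are of course messier — dependencies are partial, substitutes emerge, and the economy and legal system adapt in parallel — but the traversal captures the essential point that a single displacement propagates through chains of linkages.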
What does this mean for public policy? Public policy is, essentially, the attempt to establish and implement social institutional technologies, based on principles of human behavioral phenomena. From the haphazardly accumulated mass of social institutional materials, the challenge is to find components and assemblies that are usable, to combine and recombine them in fluent ways, in pursuit of specific objectives. One example would be what I have called “Political Market Instruments” (see Deforestation: Losing an Area the Size of England Every Year), which simply adapt the combined technologies of market exchange and regulatory oversight to the goal of increasing the production of a public good or decreasing the production of a public bad. It is an excellent example of Arthur’s modularity in action, since it is the integration of technologies that had not previously been so combined.
Some examples of social institutional technologies and how they combine include democracy, the U.S. Constitution, and corporate business organization, resulting in, among other things, constitutionally protected massive funding for commercial-saturated campaign cycles. Many would argue that new technologies are demanded by the problems created through this combination of old ones. Another example is borrowing from markets to apply their principles to public education in the form of vouchers. These examples point to the fact that while we gain much from our technologies, we also create new problems with them, and need to pick and choose how and when to implement them, always in service to a vision of how best to forge our way into a future that serves human well-being in the fullest sense.
Human social institutional and technological evolution is not something that occurs exclusively “in” the human mind, via the differentially successful reproduction of memes and their aggregation into paradigms (shifting in response to accumulations of anomalies). At least in regards to successful purposive systems, the natural phenomena upon which those memes and paradigms are working are in some ways (as Arthur points out) more the “genetic material” of those evolving forms than the packets of information working them. The programmed phenomena themselves form the alphabet and vocabulary of technological innovation, which the memes order into a grammar.
An example of an obvious human behavioral phenomenon on which the social institutional technologies of markets draw is: People will exchange what they have for something they value more highly. Another one, which allows the shift from barter to currency, is: People will recognize some fungible and generally fairly compact thing of agreed upon value, in large enough supply to serve the purpose but small enough supply to retain its value, as a medium of exchange. Many such social institutional technologies exist, based on how we respond to potential costs and benefits (including hierarchically imposed rewards and punishments and diffusely imposed social approval and disapproval), how we internalize values, and so on. The need to base social policies on an understanding of these phenomena is critical.
But, in a sense, there are two interwoven currents in our social institutional evolutionary ecology: The evolution of technologies (“purposive systems”), including social institutional technologies, and the haphazard maelstrom of psychologically and emotionally (rather than social systemically and economically) motivated reactions to it. The distinction is similar to the natural landscape around us, from which we have sculpted some architectures of our own. (Both, it might be argued, are evolutionary ecologies, and bear some of the characteristics described by Arthur, since even the haphazardly evolving social institutional landscape can borrow from other cultures or social institutional milieu and combine forms in new ways).
The purposeful and utilitarian stream is characterized by a relatively high signal-to-noise ratio (see The Signal-To-Noise Ratio), utilizing the grammar of various domains relatively fluently. The psychologically and emotionally unreflective reactions to it are characterized by a relatively low signal-to-noise ratio, speaking internal languages whose correspondence to external reality is less disciplined (see Ideology v. Methodology). Technologies correspond to scientific and legal methodologies, while the evolutionary currents around them correspond to collections of arbitrary or unreflectively formed beliefs and rituals. The latter evolve as well, and may serve many human needs, but with less precision and reliability.
To be sure, sometimes technologies are quite toxic, and cultural rituals are quite benign. But the toxicity of the former cannot be nullified by the benign qualities of the latter: it can only be addressed through another purposeful system, another technology, designed with the intention of addressing it. When there is a purpose beyond the inherent value of the thing itself (such as shelter from the elements), an architecture is required; when there is no purpose beyond that inherent value (such as a conversation with a friend or a party), no architecture beyond that which facilitates the event is required.
So the purposeful processes by which technologies emerge and develop, particularly social institutional technologies, and particularly those mediated by government action, slog through the viscous resistance of emotionally and psychologically motivated beliefs and rituals, bludgeoned by Luddites and chased by torch-bearing mobs. The progress of human consciousness (including that portion designed to address the problems caused by other products of the same process) is thus encumbered by those clinging to some sacred tradition and determined to tether all humanity to it.
The result is not stagnation, since change is constant. It is not an avoidance of the pitfalls and dangers of progress, but rather a blindfolding of it, an assurance that though forward progress will be slower and clumsier, it will also more certainly and more heavily be laden with the catastrophes of self-destruction that are inherent to stumbling down unexamined and danger-strewn paths.
Negotiating this evolving ecosystem of social institutions, technologies, and their interactions with both individuals and the natural environment involves more than hammering together a set of purposive systems. It is a vibrant whole, a metabolism, more organic than mechanistic. Understanding how it flows, how changes ripple through it, how its complexity and interconnectedness form the roiling currents we are riding, is the ultimate art and science of consciously articulating our lives with their context in ways that allow us to fulfill potentials we have only barely begun to imagine. To some extent, these potentials will be realized by technologies, including social institutional technologies. But human consciousness is more than the sum of its parts, and the more our technologies and ideologies flow and undulate with the rhythms of the evolving natural, social institutional, and technological systems within which they are embedded, and with which they articulate, the more fully we will realize the full breadth and depth of our humanity.
Ironically, the haphazardly formed social institutional landscape from which technology carves out its architectures is approximated again in the ecology of that architecture itself. It is not the escape from that beautiful dance of chaos that holds the greatest promise for humanity, but rather the perfection of the art of dancing to its rhythms.