In 35 years of reading seriously and often professionally, I have never read a book like What Technology Wants. I dog-eared at least 30 pages and filled several margins with reactions. Over two long plane rides, I was by turns absorbed, consternated, and counter-punching. I think What Technology Wants gets the story wrong, but it lays out a bold, original, and challenging position with a complex array of evidence, analysis, and conviction. The core hypothesis is untestable, however, and enough counterexamples can be summoned that substantial uncertainty undermines Kelly's deterministic argument.
Make no mistake, optimism is the operative motif. As Kelly notes, when sages or prophets foretold the future in ages past, the outlook was usually bad. The very notion of progress, by contrast, is itself a relatively modern invention. As we will see, Kelly's book is best understood as part of a larger conversation, one that has found particularly fertile ground in America.
What exactly is the technology that "wants" things? From the outset, Kelly finesses a sweepingly broad definition:
"I've somewhat reluctantly coined a word to designate the greater, global, massively interconnected system of technology vibrating around us. I call it the _technium_. The technium extends beyond shiny hardware to include culture, art, social institutions, and intellectual creations of all types. . . . And most important, it includes the generative impulses of our inventions to encourage more tool making, more technology invention, and more self-enhancing connections." (11-12)
Several of the book's key themes become apparent early. Most centrally, technology is read as, if not alive ("vibrating" with "impulses"), then something very close to alive: connections between technology and biology, moving in both directions, are drawn throughout the book. For example, "if I can demonstrate that there is an internally generated direction within natural evolution, then my argument that the technium extends this direction is easier to see." (119)
The second, and more regrettable, tendency of the book is to argue along multiple slippery slopes. In the initial definition, for example, the technium includes everything from churches (both buildings and people) to cloned sheep to George Foreman grills to the Internet. If it includes so much, what is the technium _not_? I believe that understanding "social institutions and intellectual creations of all types" and their role in the technology artifacts that more commonly concern us -- things like end-of-life treatment protocols, ever-nastier methods of warfare, or high levels of carbon dioxide output -- requires a sharper knife.
The aforementioned slippery slope argumentative technique may have been most brilliantly parodied in the student court trial scene in Animal House:
***
But you can't hold a whole fraternity responsible for the behavior of a few sick, perverted individuals. If you do, shouldn't we blame the whole fraternity system?
And if the whole fraternity system is guilty, then isn't this an indictment of our educational institutions in general?
I put it to you, Greg. Isn't this an indictment of our entire American society?
Well, you can do what you want to us, but we won't sit here, and listen to you badmouth the United States of America!
***
Several sections of What Technology Wants raised red flags that suggest similarly deft rhetoric may be in play elsewhere in the book. In an argument structurally very similar to the Animal House logic, for example, the technium is given almost literally biological properties: "Because the technium is an outgrowth of the human mind, it is also an outgrowth of life, and by extension it is also an outgrowth of the physical and chemical self-organization that first led to life." (15) If, like me, one does not grant him this chain of logic linking single-celled life forms to Ferraris or credit default swaps, Kelly's argument loses some of its momentum: for him, the quasi-sentient life force that is the sum of humanity's efforts to create is ultimately life-enhancing rather than destructive or even indifferent.
Nowhere is this faith more clearly stated than in the book's conclusion. "[The technium] contains more goodness than anything else we know," Kelly asserts. Given that the technium is everything that people have ever made or written down, what is the alternative that could be "more good"? Pure nature? But the technium is awfully close to nature too: "the technium's wants are those of life." In fact, like Soylent Green, the technium is (at least partially) people: "It will take the whole technium, and that includes us, to discover the tools that are needed to surprise the world." (359)
But the fact remains that much of the technium is built to kill, not to want life: the role of warfare in the advancement of technology dates back millennia. From swords and plowshares, to Eli Whitney's concept of interchangeable parts in musket-making, to nuclear weapons, people and governments have long used technical innovation to subdue each other. Even Kelly's (and my) beloved Internet can trace its origins directly to the game theory of John von Neumann and the logic of mutual assured destruction. Statecraft shapes technology, sometimes decisively, yet this influence is buried in Kelly's avalanche of technological determinism.
To be sure, some of Kelly's optimism has convincing grounding; it's his teleology I question. The strongest sections of What Technology Wants combine clever data-gathering and analysis to convey the power of compounding innovation: particularly where devices can get smaller, things become cheaper and more powerful at a rate never before witnessed. Microprocessors and DNA tools (both sequencing and synthesis) are essential technologies for the 21st century, with Moore's law-like trajectories of cost and performance. In addition, because software allows human creativity to express and replicate itself, the computer age can advance very rapidly indeed. The key question, however, concerns less the pace of technological progress than our relation to that progress.
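To make the compounding concrete, here is a minimal sketch (in Python, purely illustrative and not drawn from the book) assuming a hypothetical Moore's-law-style doubling of capability per dollar every two years; the doubling period and starting value are my own assumptions, chosen only to show how quickly such curves run away.

```python
# Illustrative sketch only: compounding improvement under an assumed
# Moore's-law-style doubling of capability per dollar every two years.
# The doubling period and starting value are assumptions for illustration,
# not figures taken from What Technology Wants.

def capability_per_dollar(years: float, start: float = 1.0, doubling_period: float = 2.0) -> float:
    """Capability per dollar after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

for years in (0, 10, 20, 30):
    print(f"after {years:2d} years: {capability_per_dollar(years):>8,.0f}x the starting capability per dollar")
```

Under those assumptions, three decades yield roughly a 32,000-fold improvement, which is the kind of runaway trajectory Kelly's strongest chapters document for chips and, more recently, for DNA sequencing.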
In my discussions with Kelly back when we were affiliated with the same think tank in the 1990s, he had already identified the Amish as a powerful resource for thinking about the adoption of technology. Chapter 11, on Amish hackers, raises the issue of selective rejection to a level of depth and nuance that I have seen nowhere else. Four principles govern the Amish, who are often surprising in their technology choices, as anyone who has seen their skilled and productive carpenters (with pneumatic nail guns carried in the backs of pickup trucks) can attest.
1) They are selective, ignoring far more than they adopt.
2) They evaluate new things by experience, in controlled trial scenarios.
3) Their criteria for evaluation are clear: enhance family and community while maintaining distance from the non-Amish world.
4) The choices are not individual but communal. (225-6)
Remarkably, Amish populations are growing (fast), unlike the Shakers of New England, who attempted a similar removal from the world but could not sustain their existence either individually or collectively. Rather than withdrawing entirely, the Amish often become expert in the use of a technology while eschewing its ownership. They are clever hackers, admirable for their ability to fix things that many non-Amish would simply throw away. At the same time, there are no Amish doctors, and girls have precisely one career trajectory: motherhood or a close equivalent thereof. As Kelly notes, their retreat is enabled by the people who staff and supply grocery stores and doctors' offices, participate in a cash economy, and pay taxes for roads and other infrastructure. In the end, the Amish stance cannot scale to the rest of us, partly because of their radical withdrawal from the world of television, cell phones, and automobiles, and partly because of the sect's cohesive religious ethos.
Speaking of governments and economies, the role of money and markets is also remarkably limited for Kelly. Technologies evolve through invention and innovation. Those processes occur within a lattice of investors, marketers, sales reps, and other businesspeople who have different motivations for getting technologies into people's hands or lives. Not all of these motives support the wants of life, as Bhopal, cigarette marketing, and Love Canal attest.
The capitalist underpinnings of so much Western technology are ignored, as in this summary passage: "Like personality, technology is shaped by a triad of forces. The primary driver is preordained development -- what technology wants. The second driver is the influence of technology history, the gravity of the past . . . . The third force is society's collective free will in shaping the technium, or our choices." (181)
Profit motives, lock-in/lock-out, and the psychology of wants and needs (along with business's attempts to engage it) are all left on the sidelines. Furthermore, a "collective free will" feels problematic: what exactly does that mean? Market forces? I don't think that reading is in play here. Rather than economics, Kelly seems most closely aligned with biology, to an extreme degree at some points: "The most helpful metaphor for understanding technology may be to consider humans as the parents of our technological children." (257)
But understanding ourselves as "parents" doesn't help solve real technological problems: how do we address billions of discarded plastic beverage bottles (many fouling the oceans), or the real costs of long-term adoption of the internal combustion engine, or the systems of food and crop subsidies and regulations that shape diet in an age of simultaneous starvation and obesity? How does the technium want goodness in any of those scenarios? Maybe the polity and the increasingly vibrant non-profit sector are part of the technology superstructure, since they too are human inventions, but if that's the case, Kelly's definition is so broad as to lose usefulness: the book gives little idea of what lies outside the technium. If money and markets (and kings and congresses, as well as missiles and machine guns) are coequal with cathedrals and computers, getting leverage on questions of how humans use, and are used by, our technologies becomes more difficult.
For all its strengths and shortcomings, the book Kelly has written is at once unique and rooted in a deep tradition: for well over a century Americans in particular have simultaneously worried and effused over their machines. The distinguished historian of technology Thomas P. Hughes noted in 1989 that the 1960s had given many technologies a bad name, so that cheerleaders had become scarce even as technology was infusing itself into the conceptual and indeed existential ground water: "Today technological enthusiasm, although much muted as compared with the 1920s, survives among engineers, managers, system builders, and others with vested interests in technological systems. The systems spawned by that enthusiasm, however, have acquired a momentum -- almost a life -- of their own." (American Genesis, 12) The technology-is-alive meme is a familiar one, and a whole other study could position Kelly in that tradition as well.
For our purposes, it is sufficient to note that Kelly stands as a descendant of such enthusiasts as Edison, Ford, Frederick W. Taylor, Vannevar Bush, and, perhaps most directly, Lewis Mumford, now best remembered as an urban theorist. Like Kelly, Mumford delighted in the wonders of his age while also seeing causes for concern. Note how closely his 1934 book Technics and Civilization anticipates Kelly, allowing for the fact that Mumford predated the computer:
"When I use the word machines I shall refer to specific objects like the printing press or the power loom. When I use the term 'the machine' I shall employ it as a shorthand reference to the entire technological complex. This will embrace the knowledge and skills and arts derived from industry or implicated in the new technics, and will include various forms of tool, instrument, apparatus and utility as well as machines proper." (12)
One man's technium is another man's machine. For all the similarity of their definitions, however, Mumford kept human agency at the center of his ethos, in contrast to Kelly's talk of inevitability and the other quasi-biological tendencies of the technium super-system: "No matter how completely technics relies upon the objective procedures of the sciences, it does not form an independent system, like the universe: it exists as an element in human culture and it promises well or ill as the social groups that exploit it promise well or ill." (6) Mumford focuses on the tool-builders; Kelly gives primacy to the cumulative (and, he asserts, mostly beneficent) sum of their tool-building. In the end, however, that technium is a mass of human devices, institutions, and creations so sprawling that it loses conceptual usefulness, since no human artifact is excluded.
The critical difference between the two perspectives becomes clear as Mumford resists the same determinism in which Kelly revels: "In order to reconquer the machine and subdue it to human purposes, one must first understand it and assimilate it. So far, we have embraced the machine without fully understanding it, or, like the weaker romantics, we have rejected the machine without first seeing how much of it we could intelligently assimilate." (6) Mumford's goal -- consciously understanding and assimilating technologies within a cultivated human culture -- sounds remarkably like the Amish notion of selective rejection that Kelly admires yet ultimately rejects as impractical at scale.
It is a tribute to Kevin Kelly that he forced me to think so hard about these issues. What Technology Wants deserves to be widely read and discussed, albeit with red pencils close at hand; it is a book to savor, to consider, to challenge, and to debate. The book is not linear by any stretch of the imagination: strong chapters (such as those on deep progress and on the Amish) sit alongside weaker discussions of technology-as-biology and a grocery-list treatment of the technium's attributes that feels arbitrary.
Those shortcomings help define the book: in tackling a hard, messy topic, Kelly was bound to produce patches of tentative prose, partially unsatisfying logic, and conclusions that will not be universally accepted. For having the intellectual courage to do so, I tip my hat. Meanwhile, I await a latter-day Lewis Mumford to restore human agency to the center of the argument while recognizing that governments, markets, and above all people interact with our technologies in a contingent, dynamic interplay that is anything but deterministic.